Thursday, November 19, 2009

Earn money online

If you have internet at home and want to earn some extra money each month without investing anything, this is a very good option. PTC (pay-to-click) sites pay you to view advertising. The process is simple: you register, view each ad for 30 seconds, and the money accumulates in your account; once you reach the minimum payout, you can request payment and the money is deposited into your PayPal or AlertPay account. You can then use that money to buy many things online, such as Fotolog GOLD accounts, Pet Society coins, purchases on eBay, online poker or casino play, or simply cash it out. I recommend having both a PayPal and an AlertPay account, since some sites pay only through PayPal while others pay only through AlertPay. Registration takes just a few minutes, no credit card is required, and it is the safest way to get paid online; to register, click the banners at the end of this page. To be clear, this is not a get-rich-without-working scheme, but if you sign up on several sites, spend a few minutes every day clicking through all of them, and get a few referrals, you will generate an extra income that never hurts.
That said, keep in mind that many sites offering money for viewing ads are SCAMs, meaning they do not pay. Examples include AW Surveys, Bestbux.info, the sites run by the admin Sandokan, who is bad news (Fastmoneybux.es, diamondspanish.es), RICH PTC, and anything claiming to pay 1 dollar per click. Don't waste your time; register only with trustworthy PTC sites. Every site on the list below actually pays.

Let's get started. First of all, we need our PayPal and AlertPay accounts in order to collect our earnings from viewing ads. To register, click one of the following links:

www.paypal.es o www.alertpay.com

Note: during registration, choose PayPal's Personal account type, and do not use the same password as on the PTC sites; that way you run no risk of having your money stolen.

Once you have your PayPal and AlertPay accounts, register on as many PTC (pay-to-click) sites as you like; obviously, the more sites you are registered on and clicking, the more money you will earn.

I recommend the following:

http://www.neobux.com/?r=fejasquindio
http://10bux.net/?r=antares_123
http://www.livebux.com/?r=antares_123
http://www.angelbux.com/?r=antares_123
http://www.adshow.eu/?r=antares
http://www.faranume.org/?r=antares
http://www.domptc.com/register.php?r=antares
http://buxchristmas.com/register.php?r=antares
http://www.leepubli.com/pages/index.php?refid=antares123
http://bux.to/?r=antares_123
http://www.jmgold.com/?ref=clariza
http://www.srsbux.com/?ref=antares
http://www.onbux.com/?r=antares
http://www.888bux.com/?r=antares
http://twitbux.com/?r=antares


Register and start clicking on the ads. You won't earn much at first, but you can increase your income by getting referrals.
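As a rough illustration (every rate below is hypothetical, not taken from any specific PTC site), monthly earnings scale linearly with the number of sites, ads viewed per day, and referrals:

```python
# Hypothetical PTC earnings estimate; all rates are illustrative only.
def monthly_earnings(sites, ads_per_site_per_day, rate_per_click,
                     referrals=0, referral_rate=0.0, days=30):
    """Estimate monthly income from personal clicks plus referral clicks."""
    own = sites * ads_per_site_per_day * rate_per_click * days
    refs = referrals * ads_per_site_per_day * referral_rate * days
    return own + refs

# 10 sites, 4 ads each per day at $0.01/click, plus 5 referrals
# earning you $0.005 per click they make:
print(round(monthly_earnings(10, 4, 0.01, referrals=5, referral_rate=0.005), 2))
```

The point of the sketch is simply that referrals multiply earnings without extra clicking time on your part, which is why the post stresses recruiting them.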

Good luck.

Friday, September 4, 2009

Catbird Launches vCompliance: A Comprehensive Compliance Monitoring and Enforcement Service for Virtual and Cloud Environments

Catbird, the pioneer in leading-edge security and compliance solutions for virtual and cloud data centers, today announced the immediate availability of vCompliance™, the industry’s first comprehensive automated monitoring and enforcement solution that ensures security and regulatory compliance for virtualized and cloud-based data centers. Based on Catbird’s vSecurity® platform, vCompliance’s real-time continuous monitoring service instantly detects compliance violations and quarantines offending assets to ensure ongoing compliance with leading regulatory standards, such as DIACAP, SOX, HIPAA and PCI.

With the increasing presence of VMware virtualization in regulated industries such as financial services, government, retail and healthcare comes the attendant need to ensure that these deployments meet or exceed existing compliance requirements. vCompliance is designed specifically for these environments. It unites an automated, 24x7, network monitoring service with information from the hypervisor, including a full vulnerability management solution and network access control. These critical services are mandated by regulatory specifications. Catbird vCompliance provides numerous controls required by the leading regulatory standards organizations and the most common security frameworks.

“Maintaining compliance is the #1 concern facing security officers at enterprises moving to virtualized data centers,” said Edmundo Costa, Catbird COO. “Most regulatory bodies are beginning to update their specs to incorporate virtualization, and the absence of a single authoritative resource to guide IT security and operations professionals in achieving compliance has led to spotty enforcement and glaring gaps. vCompliance fills this void to ensure worry-free ongoing compliance, complete with automatic updates to the latest standards.”

“Virtualization is becoming increasingly present in datacenters where compliance is non-negotiable,” said Shekar Ayyar, vice president of infrastructure alliances, VMware. “To meet this growing demand, VMware works with partners such as Catbird to enable complementary solutions that take advantage of VMware virtualization to ensure compliance in a simplified, cost-effective yet more comprehensive way — providing customers with even better compliance than they could achieve with purely physical infrastructure.”

Finally, Compliance that Enhances Virtualization’s Appeal

Catbird vCompliance can actually make virtualized assets more compliant than physical, providing yet another incentive for businesses to move to cloud-based and virtualized environments, in addition to the inherent cost savings and provisioning ease. Designed for deployment as a cloud-based service, vCompliance provides a comprehensive and integrated virtualized compliance product, including:
Continuous and automated audit and enforcement to meet the requirements of PCI, HIPAA, SOX, DIACAP, etc.
Policy driven automated controls for auditing, inventory management, configuration management, change management, access control, vulnerability management, and incident response
Vulnerability scanning from inside the virtual subnets, with 100% visibility of all virtual machines
Automated enforcement and quarantine of out of compliance assets
Detailed statistics on compliance status for each individual asset, zone, virtual host or physical host
Automated, customizable reports (per compliance specification, per asset or per zone) geared for the appropriate organizational audience (management, operations, etc.) that provide a quick overview or deep-dive to help resolve compliance issues, ease remediation and restore full compliance
Web-based management
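The monitor, detect, and quarantine cycle described above can be sketched roughly as follows. This is a minimal illustration of the general technique, not Catbird's actual implementation; the policy fields and quarantine mechanism are hypothetical.

```python
# Illustrative continuous-compliance loop (not Catbird's real logic).
# A policy is a set of required asset properties; non-compliant assets
# are flagged and quarantined.
POLICY = {"firewall_enabled": True, "patch_level_min": 42}

def violations(asset):
    """Return the list of policy checks an asset fails."""
    failed = []
    if asset["firewall_enabled"] != POLICY["firewall_enabled"]:
        failed.append("firewall_enabled")
    if asset["patch_level"] < POLICY["patch_level_min"]:
        failed.append("patch_level_min")
    return failed

def enforce(assets):
    """Quarantine every out-of-compliance asset; return their names."""
    quarantined = []
    for asset in assets:
        if violations(asset):
            asset["quarantined"] = True
            quarantined.append(asset["name"])
    return quarantined

assets = [
    {"name": "vm-web01", "firewall_enabled": True, "patch_level": 50},
    {"name": "vm-db02", "firewall_enabled": False, "patch_level": 50},
]
print(enforce(assets))  # → ['vm-db02']
```

Run continuously against live inventory data, a loop of this shape is what turns periodic audits into the "continuous and automated audit and enforcement" the bullet list describes.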

Integrated Compliance Management & Reporting Through VMware vSphere™ 4 and VMware vCenter™ Product Family

Via Catbird’s integrated security control console, Catbird vCompliance provides a compliance view of virtual systems in the vSphere management application. This integration brings real-time visibility and management of virtual and cloud security to a single web-based management interface, accessible from anywhere. The drag-and-drop dashboard supports automated discovery and hyperlink drill-down for intuitive management and ease of use. The multi-tenant portal provides more flexible administration options to meet the needs of multi-departmental organizations and service providers.

Because vCompliance is seamlessly incorporated into the VMware vSphere™ 4 and VMware vCenter™ management workflow, VMware administrators can instantly monitor system compliance against standard or customized policies defined by corporate security or governance. Continuous status updates are delivered to the VMware vCenter console by “Catbirds” – stateless, non-invasive appliances deployed on the virtual subnets which act as the eyes and ears of the virtual environment.

In the event of an attempted policy violation, designated personnel are instantly alerted via an array of mechanisms and the offending activity blocked -- preventing a compromise of the integrity and compliance of the system. Detailed reporting, accessible directly from the console, provides automated event logs to ease troubleshooting and remediation.

Superbly Compatible for Exceptional Performance, Reliability

vCompliance is specifically designed to enhance overall data center performance with a minimal footprint and a flexible, fully XML and Web-services based architecture that allows for seamless integration with 3rd party reporting tools, trouble-ticketing and other enterprise-class services in a heterogeneous environment. In addition, Catbird is integrated with McAfee’s ePolicy Orchestrator (ePO) management console, providing customers with a single administrative interface for both virtual and physical security, and both network and end-point security.

"Our government and enterprise clients face regulatory pressure everyday from external and internal auditors," noted Adnan Hindi, COO of Catbird partner Palladian Technology. "With Catbird vCompliance, these customers can seamlessly deploy virtualization technology without compromising any of their established policies or 3rd party regulations, enabling a quick and easy shift to their virtualized data centers. We haven't seen any other product in the market with the depth and breadth of coverage that Catbird has."

“Catbird’s underlying technology is the most dependable and respected in the industry, relied on by hundreds of customers across some of the country’s largest Managed Security Services Providers to provide security and compliance for their customers,” Costa said. “Given the high cost of audits and financial damage that can result from a breach, ensuring ongoing compliance with an easy-to-deploy mechanism such as vCompliance can deliver a considerable ROI, along with peace of mind.”

Hardware Server Sales Slump as IT Pros Capitalize on Current Capacity and Add Virtualization and Consolidation to Get Through 2009 – New Research From TheInfoPro

TheInfoPro, an independent research company for the IT industry, today released new real-time data from its pending Server Study (final results due in October, Q3 2009) indicating that more than 50% of new servers being installed in 2009 will host virtualization, with that share projected to grow to 80% by 2012.

Announced at VMworld 2009, TheInfoPro's Server Study draws on interviews with 195 IT pros; initial spending data indicates that 22% expect server spending to increase this year, while another 34% expect decreases. The IT pros range from Fortune 1000 (F1000) organizations to midsize enterprises (MSEs) in North America and Europe, and the interviews were completed between June and August 2009.

TheInfoPro Server Study also captures data about the rise of desktop virtualization, offers highlights on spending in each area of server management by vendor, and gives an in-depth look at VMware’s benefits and challenges in the current IT marketplace. For more information, visit TheInfoPro at booth #1322 or call the contact number below for a real-time briefing.

Hardware vs. Software Spending

TheInfoPro’s network of IT pros stated that virtualization and consolidation are a critical lifeline for optimizing the current capacity of their existing physical servers. Virtualization deployments will continue to expand in the coming months, with 70% of the respondents citing it as critical to meeting their business objectives. Though hardware spending continues to show little growth, more than 50% of respondents do expect to resume hardware acquisition once the economy stabilizes. For now, Hewlett-Packard is positioned as a strong vendor in future choice of spending and faces the lowest vulnerability to customer switching when compared to competing vendors.

In software, despite the revenue-dampening effect of enterprise licensing, Microsoft remained steady in its category, with 22% of respondents indicating they would spend more throughout the remainder of 2009. VMware and Red Hat remain strong in their respective categories, with 41% and 30% spending more in 2009, respectively.

“While virtualization does show continued growth in 2009, the depth of this penetration will really depend upon the ability to manage and deploy applications in these complex environments,” said Bob Gill, TheInfoPro’s Managing Director of Server Research. “The 2010 outlook and beyond are good indicators that even as IT budgets continue to rise, hardware spending will be looked at differently given the options of virtualization and consolidation.”

Virtustream Secures $25 Million Equity Financing

Virtustream, Inc. a privately held infrastructure services firm, today announced the completion of new equity financing with total commitments of $25 million led by Virginia based Columbia Capital and Miami based Blue Lagoon Capital.

Virtustream is an infrastructure services firm committed to helping clients actualize the enterprise cloud by providing strategy, integration and managed services leveraging deep virtualization experience, and its own proprietary platform. The infusion of funding is part of Virtustream’s growth strategy and will be used to continue accelerating its platform development, build international presence and expand domain expertise.

“Columbia Capital is excited to work with this team of experienced virtualization and enterprise transformation professionals,” said Patrick Hendy, Partner at Columbia Capital. “We are very impressed with Virtustream’s proven management team and feel that they are well positioned to take the lead in the enterprise cloud computing space.”

“Virtustream has deep virtualization experience, breadth & depth of highly scalable services, strategic alliances with key hardware and software partners as well as unique business and pricing models making them a compelling alternative to larger firms,” said Rodney Rogers, CEO of Blue Lagoon Capital.

Virtustream also recently completed the acquisition of VirtualizeIT, the premier European consultancy practice dedicated to the advancement and adoption of virtualization technologies, and Brigh Technologies (Brigh), the leading North American provider for the design and deployment of custom solutions in the virtualization space. The union brings together pioneers in enterprise computing, including one of the first VMware professional services partners in North America and the 2008 VMware EMEA Partner of the Year. The stockholders of both acquired companies will have an equity interest in Virtustream, and all employees will continue their employment with Virtustream.

“Virtustream’s strategy is to fold VirtualizeIT and Brigh’s recognized virtualization services leadership into our existing capabilities to continue to build out our proprietary platform,” said Kevin Reid, CEO of Virtustream. “The platform will harness years of experience gained from thousands of successful physical to virtual transformations performed by both entities, leveraging the tools and historical data collected from those environments.”

“VirtualizeIT and Brigh’s clients recognize the complexity of being efficient in a virtualized environment and we now have the collaborative ability to address additional requirements. Typically, when companies undergo some transformation exercise to move from a physical to a virtual world they traditionally see annual savings ranging from 30 to 50%. By adding the Infrastructure Services that Virtustream provides to manage that platform, our clients can potentially yield an additional 10 to 20% benefit. Our Infrastructure as a Service offering delivers unique, industry leading resource level Service Level Agreements,” added Reid.

As a silver sponsor, Virtustream will be showcasing their solutions in booth #2314 at the VMworld 2009 exhibition in San Francisco at The Moscone Center from August 31 through September 3. Key leadership will be available to brief media and analysts on the firm’s differentiators in the enterprise cloud space.

Vizioncore Offers vConverter SC As Freeware

Vizioncore Inc., the market leader in virtualization data protection and management solutions, today announced that vConverter SC (Server Consolidation) will now be available for free. Customers may use the tool to accelerate towards a virtualized environment, making the most of maintenance windows and reducing downtime on current physical machines.

vConverter SC offers an intuitive GUI to enable rapid simultaneous conversions of single or multiple servers without disruption to the source system. By combining cutting-edge disk and networking technology with extensive automation of pre- and post-conversion tasks, it boasts the ability to successfully migrate more servers per conversion window, saving valuable maintenance time. Furthermore the product is compatible with hypervisors from leading vendors such as VMware, Microsoft and Citrix, making it the perfect fit for almost every environment.

The move continues the precedent set at VMworld Europe earlier this year with the launch of the Virtualization EcoShell Initiative (VESI), another free offering from Vizioncore to the virtualization community. In providing these tools and services, the company is cementing its intention of offering a selection of the Vizioncore product portfolio for free, thereby widening adoption of virtualization across the market.

“We’re excited by the prospect of taking Vizioncore to a wider audience and by lowering the barriers to adoption,” said Devang Kothari, Product Manager for vConverter, Vizioncore. “We hope that end users looking to take the first steps into managing their virtualized estates will benefit from seeing how Vizioncore products help overcome their pain points and then be encouraged to deploy our comprehensive suite of virtualization management products.”

vConverter SC can be downloaded for free here. At VMworld 2009, visitors are invited to see vConverter SC in action alongside other Vizioncore offerings at booth #1402.

Monday, March 9, 2009

New AppTitude 3.5 Adds Support for Citrix XenApp Virtualization To Leading Application Compatibility Testing Solution

AppDNA, the worldwide leader in compatibility testing for application environments, today announced the general availability of its AppTitude 3.5 solution. The new release adds support for the Citrix® XenApp™ application delivery solution for both hosted and streamed applications. A beta for the XenApp support was introduced at the Citrix Summit event in October 2008 and was tested by a wide range of Citrix customers and partners. AppTitude 3.5 provides business benefits that include vastly reducing the time, cost and risks associated with deployment of OS migrations, software upgrades, virtualization and other changes.

What’s New in AppTitude 3.5

The new release of AppTitude™ allows organizations to check for application compatibility with XenApp (versions 3.0 through 5) and Windows Terminal Services. The new release has two separate reporting modules – the Server-Based Computing (SBC) Module for compatibility with XenApp hosted and Terminal Services configurations and the Virtualization Manager for XenApp streaming. AppTitude 3.5 supports two types of analysis – static and runtime – including an extended runtime analysis that can import customized OS images. This allows the AppTitude algorithms to adapt to the build configuration of the target OS platforms, which generate more accurate and relevant reporting for those customers with customized applications. It also offers new performance and security analyses that are essential for assessing application compatibility for XenApp and Terminal Services environments. A new 64-bit report in the Server and Desktop Compatibility Manager provides a dedicated view on application suitability for 64-bit computing. New support for external data source lookups enables third party and vendor databases, such as Microsoft’s list of certified Vista-compatible applications, to supplement AppTitude’s direct analysis.

“Companies are looking to application virtualization and XenApp to reduce the cost and complexity of managing and delivering applications,” said Bill Hartwick, Senior Director of Product Marketing at Citrix. “AppTitude 3.5 can help identify up front what changes, if any, need to be made to the application to deliver it with XenApp. By providing rapid and comprehensive visibility into their potential application compatibility issues, AppTitude 3.5 enables our partners to help customers more quickly and cost-effectively manage their applications in our virtualization environment.”

“With AppTitude 3.5, we can now provide unprecedented insight into application DNA for those organizations delivering applications with XenApp,” said Mike Welling, CEO of AppDNA. “With its extended runtime analysis and enhanced reporting capabilities, AppTitude 3.5 provides more accurate reporting and an even more powerful analytic dashboard for dramatically improving XenApp rollouts.”

Pricing and Availability

AppTitude 3.5 for Windows platforms and AppTitude Virtualization Manager for Citrix XenApp are available immediately from AppDNA. AppTitude pricing varies based on configuration, ranging from $5,000 for a typical pilot installation to a starting range of $50,000 for enterprise-wide deployments.

Red Hat Moves to Expand Server Virtualization Interoperability

Today we made an announcement that I think is going to generate a lot of interest. Red Hat and Microsoft are working together to ensure virtualization interoperability. This is big news. Companies deploy virtualization to make their infrastructure more efficient. By allowing Windows Server to run as a guest on Red Hat virtualization, and Red Hat Enterprise Linux to run as a guest on Windows virtualization, customers gain a new level of compatibility, interoperability and support. This is a major step forward for the industry.

Both Microsoft and Red Hat now have the capability to provide complete end-to-end virtualization solutions, from hardware to operating system, on the two industry-leading operating environments, which IDC says represent about 80 percent of today’s virtualized operating systems. This breaks through a major hurdle to more widespread adoption of virtualization.

Of course, it is also big news because it is rare that these two companies publicly work together. The companies continue to compete vigorously. But virtualization interoperability is very high on customers’ wish lists, and I’m pleased both companies have been able to respond in this cooperative fashion.

But for the record, it isn’t the first time Red Hat and Microsoft have cooperated. For example, Microsoft has recently joined the open source AMQP high performance messaging project, of which Red Hat was a founding member. Red Hat customers are already deploying AMQP technology with Red Hat’s Enterprise MRG product. The messaging element (the “M” in MRG) provides messaging up to 100 times faster than some legacy technologies.

One of the big questions on the minds of many members of the open source community is whether Red Hat has compromised its ideals. Nothing could be further from the truth. Red Hat’s growth, and its differentiation, come from its belief in, and commitment to, the open source community model. It is our view – and this view is institutionalized throughout our company – that we have to serve the community, as well as our customers, shareholders, and employees. The moment we stop doing so, we eliminate the differentiation which drives our growth.

So we undertook this interoperability effort with strict adherence to our principles. The companies signed two agreements: One in which Red Hat joined the Microsoft Server Virtualization Validation Program (SVVP), which validates Windows Server guests running on Red Hat Enterprise virtualization technologies, and the other which certifies Red Hat Enterprise Linux guests running on Windows Server Hyper-V.

The agreements contain no patent or open source license components. There are no financial clauses beyond simple certification testing fees. These are straightforward certification and validation agreements.

I am excited about this step forward for the industry. And I am pleased we did it without compromising our commitment to open source. That’s leadership we can be proud of.

Apple App Store's New Rival: Jailbroken Paid Apps

With more than 1.5 million jailbroken iPhones out there by some estimates, a new alternative to Apple's App Store launched over the weekend to great interest in the iPhone community. Cydia Store, the de-facto app store for jailbroken iPhones, now offers paid apps, ending Apple's monopoly.

If jailbreaking your iPhone is not too big a price to pay (besides losing your Apple warranty) in order to get all those cool applications that Apple deemed unauthorized or kicked out of its App Store, then maybe you should have a look at the new version of the Cydia Store, which now features paid apps as well.

Until now, Cydia allowed jailbroken iPhone users to install countless free (read: unauthorized) third-party apps, welcoming a growing community of developers whose applications had been rejected from Apple's official store. The latest update to Cydia, released on Sunday, includes a full-fledged app store, complete with payment processing, in effect competing with Apple's solution, but in the grayer realm of legality.

It's worth mentioning that Apple does not endorse jailbreaking, that is, hacking the iPhone's OS to allow installation of third-party apps that were not approved by the Cupertino company. In fact, Apple contends that jailbreaking your iPhone is a violation of copyright law. However, hacking the iPhone has proved to be a popular practice among many users.

Cydia brings highly requested features to jailbroken iPhones for free, such as copy/paste, video recording, and tethering options (using the iPhone as a modem). The new version of the Cydia app store, now supporting paid applications, opens a new world for those iPhone developers whose apps were rejected by the official Apple App Store. The first paid app in Cydia is a contacts application that puts contact photos alongside names and costs $1.

Nevertheless, the Cydia Store has downsides as well. Besides the main inconvenience of having to jailbreak your phone (which can be achieved easily these days), the store accepts payments only via Amazon Payment accounts (but a recent Twitter post from the store's creator, Jay Freeman, says PayPal payment is coming soon). Also, the store accepts only a limited number of app submissions at a time (submissions are halted now), highlighting the limited personnel to handle approvals.

But the Cydia Store is certainly an interesting development in the iPhone world to watch over the coming years. The iPhone still lacks some features that many of its users crave, and it looks like plenty will try to get them, regardless of the legal uncertainty they face in the process. And even though Apple tries to block jailbreaking with every iteration of the iPhone's software, a new hack makes its way onto the Internet a few days later.

System trouble halts Japanese weather data

A computer problem halted the flow of weather data from the Japan Meteorological Agency to media organizations and other users today.

The cause of the problem, which occurred around 3 a.m. local time (6 p.m. GMT) in systems of the Japan Meteorological Business Support Center, is unknown, and the system hadn't been brought back online as of 1 p.m. local time.

As a result of the problem, the Meteorological Agency and many news organizations have been unable to publish current weather data and weather forecasts. The most up-to-date forecast on the agency's home page is from 11 p.m. local time on Sunday night. On a typical Monday, it would have been updated at least twice before lunchtime.

Windows Server will Run in Enterprise Cloud

Future versions of Windows Server will enable companies to efficiently manage and provide virtualized applications through the Web just like Microsoft Corp.'s upcoming platform-as-a-service, Windows Azure, a company executive said this week.

"The innovation in Azure and future versions of Windows Server will be shared, and that code base will continue to cross-pollinate," said Steven Martin, senior director for developer platform product management at Microsoft, in an interview. "The corporate data center at some point in time will look like a mini-cloud, partitioned by application workload."

First previewed last fall, Windows Azure is Microsoft's foray into bringing Windows Server online as a cloud computing platform. Developers will be able to port or write applications using Microsoft's popular .Net tools and Web standard interfaces such as REST, SOAP and Atom, and host them on Azure, similar to Amazon.com Inc.'s EC2, Salesforce.com's Force.com, or Google Inc.'s App Engine.

Azure is expected to be released later this year. Detailed pricing hasn't been released. Microsoft is expected to talk about Azure at its MIX Web development conference in Las Vegas next week.

Conventional hosting entails companies buying or leasing a server from a data center operator and running a set number of applications off it. That can be complicated to manage, entail a lot of upfront cost, and can be difficult to scale quickly on demand.

Azure, like other newer-generation cloud platforms, enables faster setup and easier scaling, and lets users pay for usage, thus avoiding upfront investment.
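The economic contrast the article draws between leased servers and pay-per-use cloud hosting reduces to simple arithmetic. The rates below are hypothetical, chosen purely to illustrate the shape of the comparison:

```python
# Illustrative hosting cost comparison; all prices are hypothetical.
def fixed_hosting_cost(servers, monthly_lease):
    """Leased servers cost the same whether busy or idle."""
    return servers * monthly_lease

def cloud_cost(hours_used, rate_per_hour):
    """Pay-per-use billing charges only for hours actually consumed."""
    return hours_used * rate_per_hour

# Two leased servers at $300/month vs. 120 busy hours at $0.50/hour:
print(fixed_hosting_cost(2, 300))  # → 600
print(cloud_cost(120, 0.50))       # → 60.0
```

For a bursty workload that is idle most of the month, the pay-per-use model wins; for a workload that runs flat out around the clock, the fixed lease can be cheaper, which is one reason Martin pitches the hybrid "software+services" story later in the article.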

"Our goal is to completely hide the complexity of hardware from developers," Martin said.

Martin mentioned several Azure beta testers. One, a company called S3Edge, helps manufacturers recall defective products.

"Ideally, a product doesn't get recalled and they don't need to activate our service," Martin said. "But if it does, they need to be prepared to scale very fast."

An independent software vendor, Epicor Software Corp., is writing the next version of its ERP software so it can be hosted via Azure, Martin said, while another, Micro Focus, is taking Cobol applications off a mainframe and hosting them on Azure (as well as Amazon.com Inc.'s EC2) for its customers.

Azure runs on Windows Server 2008 inside Microsoft's data centers. The fact that Microsoft offers both Windows Server software and the Azure service as part of its "software+services" strategy is a plus for companies unsure about committing completely to a cloud infrastructure, Martin said, whether because they think they can run it more cheaply or with more agility themselves, or because regulations require them to.

"We make it really easy for you to transition back to on-premises without having to completely rewrite your app. You control your own destiny," Martin said. By contrast, "if I'm a startup, it's gotta be in the back of my mind when I look at Amazon.com's 10-K, that 'Gosh, they may want to go back to just selling books.'"

Besides corporations, Web hosting companies may be interested in hosting Azure to make their infrastructure more nimble and efficient. Martin said hosting companies and other application service providers won't get access to Azure before enterprises, though.

Server Vendors Stung by Falling Sales

If companies around the world are freezing technology spending, then it's the server market that appears to be bearing the brunt of the financial chill.

According to the latest Gartner figures, worldwide server shipments declined by 11.7 percent in the fourth quarter of 2008 compared to the same period a year earlier, which added up to a nasty 15.1 percent contraction in overall revenue.

All vendors and most market segments posted declines, but the strange geographical concoction known as EMEA (Europe, the Middle East and Africa) posted the worst figures, showing revenue declines of 20.6 percent, ahead of the US, which dropped 14.6 percent. Only Japan posted a rise, reflecting the fact that the worst of the slump had yet to reach the Far East.

Unix server shipments were down 10.5 percent, which sounds bad but was still less severe than the 11.4 percent drop experienced by the more mainstream x86 market. The single bright spot was blade servers, which posted gains; for 2008 as a whole, Gartner predicts blade revenue will have increased by 30 percent.

As to vendors, IBM was the biggest loser, down 22.4 percent in server shipment numbers, with Dell (-7.1 percent), Sun (-3.9 percent), and HP (-1.6 percent) less affected. It is worth pointing out, however, that shipments don't tell the whole story; all of the above saw substantial falls in server revenue, which suggests that some have kept shipment numbers up by cutting prices.
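
The observation that shipments held up better than revenue can be made concrete: if revenue falls faster than unit shipments, average selling prices must have dropped. A minimal sketch of that inference, using one quoted shipment figure and a purely hypothetical revenue figure (Gartner's per-vendor revenue numbers are not given in the article):

```python
def asp_change(shipment_change, revenue_change):
    """Return the implied fractional change in average selling price,
    given year-over-year fractional changes in shipments and revenue."""
    return (1 + revenue_change) / (1 + shipment_change) - 1

# Hypothetical example: shipments down 7.1% (Dell's quoted figure) but
# revenue down 20% would imply ASPs fell roughly 13.9%.
change = asp_change(-0.071, -0.20)
print(f"Implied ASP change: {change:.1%}")
```

The same arithmetic explains the article's point: a vendor can keep shipment declines modest while revenue falls substantially only by cutting prices.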

"The weakening economic environment had a deep impact on server market revenues in the fourth quarter as companies put a hold on spending across most market segments," said Gartner's senior research analyst, Heeral Kota, perhaps stating the obvious. "Almost all segments exhibited similar behaviour as users sought to reduce costs and spending, deferring projects where possible."

What the figures underscore is that US spending has contracted sharply, while spending in Europe and other parts of the world has also dropped off, not helped by fluctuating currency rates, which can cause prices to rise.

The next year won't be much better. "The continued weak economic environment will cause users to be extremely cautious with levels of expenditure which will make for a particularly challenging environment for vendors. The server market already has high levels of vendor consolidation but the conditions expected during 2009 will increase the threat of further consolidation," said Kota.

Seagate, AMD Showcase Super Fast SATA Drive

As the year marches on, work proceeds apace on the next big step in the Serial ATA specification. Alternatively called SATA Revision (or just Rev) 3 or SATA 6Gbps, the updated specification was finalized late last year. This week at the FOSE '09 government tech show in New Orleans, Seagate is becoming the first hard drive manufacturer to publicly demonstrate a SATA 6Gbps hard drive. Seagate's technology demo is in partnership with AMD, which has supplied the necessary chipsets to achieve the third-generation SATA interface's fast speeds.

The SATA spec bump is a natural evolution. Notes Seagate's Marc Norblitt, "We need to make the interface faster, so the interface doesn't become a bottleneck that causes performance to suffer dramatically. The higher the capacity of the drive, the higher the areal density; the higher the areal density, the more bits you get under the head in the same amount of time." That, Norblitt adds, translates into data being output faster.

The read speeds that Seagate has achieved, and will be demonstrating at the show, are about 550 megabytes per second (including command overhead). By comparison, first-generation SATA 1.5Gbps achieved 120MBps, and second-generation SATA 3Gbps achieved 250MBps. The demonstration uses an AMD reference motherboard, with an AMD SATA 6Gbps chipset and CPU, and a prototype 6Gbps drive. The drive uses the same SATA connectors as current-generation SATA drives, and is backward compatible with earlier SATA versions.
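
The relationship between the line rate (6 gigabits per second) and the measured throughput (~550 MB/s) follows from SATA's 8b/10b encoding, in which 10 bits on the wire carry 8 bits of data. A quick sketch of the theoretical ceiling for each generation:

```python
def sata_max_mbps(line_rate_gbps):
    """Theoretical payload ceiling in megabytes/second for a SATA link.
    SATA uses 8b/10b encoding, so 10 line bits carry 8 data bits."""
    data_bits_per_sec = line_rate_gbps * 1e9 * 8 / 10
    return data_bits_per_sec / 8 / 1e6  # bits -> bytes -> megabytes

for gen, rate in [("SATA 1.5Gbps", 1.5), ("SATA 3Gbps", 3.0), ("SATA 6Gbps", 6.0)]:
    print(f"{gen}: ~{sata_max_mbps(rate):.0f} MB/s ceiling")
# Ceilings of 150, 300, and 600 MB/s; protocol and command overhead
# explain why observed figures (120, 250, ~550 MB/s) come in below them.
```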

Although SATA 6Gbps will be here by year's end, Norblitt says he doesn't expect the technology to be needed for another two years. When such drives do ship, they will provide a future-proofed way for individuals to plan ahead. Norblitt expects Seagate will have a SATA 6Gbps drive to market in "late 2009." The company expects to focus on placing the drives in high performance PCs, gaming PCs, and low-end server PCs. "We're targeting customers who want high capacity, high performance disc drives," explains Norblitt.

Among the big improvements for SATA 6Gbps: better power management and improved native command queuing. With regard to power management, the new spec gives more control to the host or device. Instead of shutting the interface off, it allows the interface to drop into a slumber mode, one that's initiated by either the device or the host. The updated native command queuing allows streaming commands. So how will SATA 6Gbps stack up against the other forthcoming interface speed bump, USB 3.0? "SATA is a storage interface; USB is a universal interface," Norblitt says simply. The two interfaces, he adds, will be able to co-exist.

Oracle Offers Sourcing Software as a Service

Oracle on Monday took a step in the direction of on-demand ERP with the announcement of Oracle Sourcing On Demand, a SaaS module for handling the purchase of supplies and services.

The new product is based on components in Oracle's flagship on-premise ERP (enterprise resource planning) system, E-Business Suite Release 12.

The software allows various parties involved in the sourcing process to collaborate on decisions, ensuring the best deals are struck, Oracle said. It also "meets the highest security standards," includes packaged integrations with applications like Oracle Purchasing, and provides standard tech support as well as a help desk. Pricing information wasn't immediately available.

In providing targeted SaaS (software as a service) offerings along with traditional ERP systems, Oracle is following a path similar to its rival SAP, which already had an on-demand sourcing application, and is expected to announce a number of additional SaaS services later this year.

While Oracle has sold on-demand CRM (customer relationship management) software for some time, it makes sense for it to start with sourcing on the SaaS ERP side, since a number of specialized companies have already proven there's a viable market, said 451 Group analyst China Martens.

Other on-demand sourcing and procurement vendors include PowerAdvocate, Ketera and Coupa, the last of which was formed by former Oracle employees.

Companies that are setting up shop in a new country, for example, might be attracted to the SaaS model for sourcing, since they could get it up and running more quickly, Martens said.

"It's definitely an area we've seen lend itself to the SaaS model," she added. "It's a cool thing for Oracle to do here. ... For Oracle, the question now is do you look at other pieces of ERP [for on-demand]?"

Lifeboat Distribution Offers New Virtual Iron and DataCore Virtualization Solution Bundle

DataCore and Virtual Iron "bundle" available through software distributor Lifeboat

Ft. Lauderdale, Fla. - DataCore Software, the leading provider of storage virtualization, business continuity and disaster recovery software solutions, and Virtual Iron Software, Inc., a provider of enterprise-class server virtualization and virtual infrastructure management software, today announced that distributor Lifeboat Distribution has unveiled a new promotion for resellers who purchase DataCore and Virtual Iron licenses together. Lifeboat will provide its resellers with additional margin and marketing support for selling the new bundle. Lifeboat Distribution is an international specialty software distributor for virtualization and other technically sophisticated products. For more details, resellers should contact datacore@lifeboatdistribution.com.

Resellers now have more cost-effective alternative
Services and solutions provider TelosIT, Inc. already delivers solutions that combine DataCore with Virtual Iron. "Lifeboat’s decision to offer additional margin for using DataCore in conjunction with Virtual Iron is something we will leverage," said Kevin Carlson, CTO, TelosIT. "This represents a very cost-effective option for resellers and the resellers taking advantage of this opportunity will soon see just what high performance solutions are possible when combining these two robust offerings."

The DataCore and Virtual Iron combination benefits customers – virtual storage and virtual servers go hand-in-hand
Solutions provider TelosIT has already enabled Matrix Design Group, an award-winning interdisciplinary firm providing professional engineering consulting, including program management and client representation, for both the private and public sectors, to expand capacity and achieve dual-site data redundancy with DataCore and Virtual Iron, while saving tens of thousands on lease renewals for its previous-generation SAN. "With DataCore and Virtual Iron in place, both administration and management were simplified greatly, and I was able to further reduce my IT spending," said Eric W. Smith, vice president, Matrix Design Group.

About Lifeboat Distribution
Lifeboat Distribution, a subsidiary of Wayside Technology Group, Inc. (NASDAQ: WSTG), is an international specialty software distributor for virtualization, security, application and network infrastructure, business continuity/disaster recovery, database infrastructure and management, application lifecycle management, science/engineering, and other technically sophisticated products. The company helps software publishers recruit and build multinational solution provider networks, power their networks, and drive incremental sales revenues that complement existing sales channels. Lifeboat Distribution services thousands of solution providers, VARs, systems integrators, corporate resellers, and consultants worldwide, helping them power a rich opportunity stream, expand their margin+ services revenues, and build profitable product and service businesses. For more information, visit www.lifeboatdistribution.com, or call +1.800.847.7078 or +1.732.389.0037.

About DataCore Software
DataCore Software, the leading provider of storage virtualization SAN software, fundamentally changes the economics of managing storage with innovative software that combines advanced functions and services with the agility and savings of hardware independence. DataCore lowers the cost and complexity of IT by making storage efficient, fast, flexible, fail-safe and virtual. DataCore's portable storage server software simplifies and automates capacity expansion and centralizes storage management for Windows, UNIX, Linux, MacOS, NetWare, VMware and other leading open system and virtual server platforms. DataCore is privately held and its corporate headquarters are in Ft. Lauderdale, Florida. For more information, call (877) 780-5111 or visit www.datacore.com.

DataCore, the DataCore logo and SANmelody are trademarks or registered trademarks of DataCore Software Corporation. Other DataCore product or service names or logos referenced herein are trademarks of DataCore Software Corporation. All other products, services and company names mentioned herein may be trademarks of their respective owners.

About Virtual Iron Software, Inc. – True Server Virtualization for Everyone
Virtual Iron provides server virtualization software that reduces the cost and complexity of operating and managing IT infrastructure for organizations of all sizes. Leveraging industry standards, open source, and built-in hardware-assisted acceleration, Virtual Iron provides a complete and cost-effective solution including VI-Center, an intelligent virtual infrastructure management platform. Over 2,000 customer organizations worldwide leverage Virtual Iron today to support a broad range of data center initiatives including server consolidation, virtual server management, dev/test optimization, business continuity and virtual desktop infrastructure (VDI) enablement. The software is available exclusively through Virtual Iron's Channel One partner network. Trial versions of the software are also available for free download at www.virtualiron.com/free. For more information, visit www.virtualiron.com or e-mail info@virtualiron.com.

Tuesday, March 3, 2009

Asus' Dual Panel Touchscreen PC Concept

Things were already a bit touchy-feely at the Asus booth at CeBIT with the company's EeePC T91GO. Fighting for the spotlight, there's the Dual Panel touchscreen PC.

Despite being "just a concept," the notebook is impressive nonetheless. The display models were labeled with some anti-social "do not touch" signs, but according to Engadget, the models on hand were running Windows 7 and the onscreen keyboard looked pretty decent.

Credit: Engadget

What's your take on the touch screen craze going on at the moment? We've already seen two netbook tablets this week and now we're seeing this concept model of a Dual Panel touchscreen PC. We're all for notebooks (or netbooks) that convert into tablets, but we're not sure we like the idea of giving up our keyboards in favor of a virtual one; and for these to be in any way successful, the pricing will need to be pretty competitive.

What do you reckon -- you into it?

Layoff & Hiring News for iPhone

Few headlines are as depressing as those announcing yet another round of layoffs from yet another struggling company. On Friday, February 27, for example, Pilgrim's Pride announced it would close three chicken-processing plants, shedding 3,000 jobs. Alliant Techsystems announced it would cut 300 jobs. Mastercraft Boats said at least 110 workers would be let go.

In all, I read about 22 companies or government agencies planning to shed employees through Santhi Rudraraju's Layoff & Hiring News--007 app for the iPhone and iPod touch. The app, which works over a Wi-Fi, 3G, or EDGE connection, feeds a steady flow of headlines from the full-featured Layoff Daily Web site.

Layoff's simple but drab interface befits these austere times. The app is not much different from any other garden-variety RSS reader and is fairly easy to navigate. Tap on a headline, and Layoff will launch a browser window within the app. It features two days' worth of headlines, updated in real time, as well as a section on major corporate layoffs and news of companies that are actually hiring--the latter no doubt being an effort to stave off reader attrition by suicide.

The app can be slow to load headlines--a problem the developer seems to acknowledge with a "thanks for being patient" message that appears every time you tap a section on the menu bar at the bottom of the screen.

Bottom line: Layoff & Hiring News gives you all the bad news you can stomach (and some good) in a no-frills package. Xanax sold separately.

Layoff is compatible with any iPhone or iPod touch running the iPhone 2.2 software update.

[Ben Boychuk is a freelance writer and columnist in Rialto, Calif. Feel free to drop him a line.]

Fujitsu Siemens Computers Holding BV plans to launch in the middle of this year an enterprise desktop computer that consumes no energy when switched off, the company said Sunday at the CeBIT trade show in Hanover, Germany.

Computers, like most electronics, consume a small amount of energy even when switched off because of losses in the transformer or sensors that remain active for functions such as remote power-on. For a PC, the consumption when powered off is typically between 1 and 4 watts, said Fujitsu Siemens. The best that energy-conscious users can do is keep electronics on a power strip that they must remember to turn off.

The Esprimo 7935 packs a system that achieves zero consumption without pulling the plug, said Lothar Lechtenberg, a company spokesman.

Businesses with a lot of computers stand to save a significant amount of money each year by ensuring that their PCs aren't consuming power overnight, but there are disadvantages. Many companies administer software updates overnight, and having the machines unplugged means that wouldn't be possible.
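
The scale of the potential savings is easy to estimate from the 1-4 watt standby draw quoted above. A rough back-of-the-envelope calculation, where the fleet size, hours, and electricity price are all illustrative assumptions rather than Fujitsu Siemens figures:

```python
def annual_standby_cost(num_pcs, standby_watts, off_hours_per_day, price_per_kwh):
    """Rough yearly cost of standby ('off') power draw across a PC fleet.
    Converts watt-hours to kilowatt-hours, then applies the tariff."""
    kwh = num_pcs * standby_watts * off_hours_per_day * 365 / 1000
    return kwh * price_per_kwh

# Hypothetical fleet: 1,000 PCs drawing 3 W when 'off' for 16 h/day at $0.12/kWh
cost = annual_standby_cost(1000, 3, 16, 0.12)
print(f"${cost:,.0f} per year")  # roughly $2,102 per year
```

Modest per-machine, but it compounds across large fleets, which is presumably the market Fujitsu Siemens is aiming at.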

Fujitsu Siemens said it has solved this problem by allowing the machines to be awake and consuming a very small amount of power during a predefined time slot during which updates can take place. Once the time slot passes, the machine returns to zero-watt mode until it is switched on by its user.
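
The wake-window behavior described above amounts to a simple scheduling test: leave zero-watt mode only while the clock falls inside a predefined slot. The real Esprimo scheduling interface isn't public, so the sketch below is purely illustrative of the logic, including the case where an update window spans midnight:

```python
from datetime import time

def should_wake(now, window_start, window_end):
    """Return True if 'now' falls inside the predefined update window
    during which the PC leaves zero-watt mode to accept updates."""
    if window_start <= window_end:
        return window_start <= now <= window_end
    # Window spans midnight, e.g. 23:00-02:00
    return now >= window_start or now <= window_end

# Example: updates scheduled for the 02:00-04:00 slot
print(should_wake(time(3, 0), time(2, 0), time(4, 0)))   # True
print(should_wake(time(12, 0), time(2, 0), time(4, 0)))  # False
```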

Other green credentials of the new computer include a power supply that is 89% efficient, which means less electricity is wasted through heat, and motherboards with no halogen or lead. The PC conforms to the U.S. Environmental Protection Agency's Energy Star 5.0 standard, which will come into use in the middle of this year, and the German Blue Angel mark.

The machine is likely to cost between $757 and $883 (€600 to €700). Availability outside of Fujitsu Siemens' Europe, the Middle East and Africa sales area was not announced.

Intel opens up the Atom processor to TSMC

Intel Corp. on Monday announced a partnership that could provide access to the chip design of its low-cost Atom processor to Taiwan Semiconductor Manufacturing Co.

The partnership with TSMC could lead to customized chips that could provide Intel access to new markets it can't reach alone, said Sean Maloney, Intel executive vice president and chief sales and marketing officer, during a conference call with reporters.

TSMC will be able to provide its customers with details of Atom's design so that they can design chips based on its core.

Atom chips currently go into low-cost laptops, also known as netbooks, and devices such as mobile Internet devices (MID) and smartphones. Future Atom chips will include more integrated PC capabilities, such as graphics and Internet connectivity, that could push the processor into embedded devices and consumer electronics.

To date, Intel has alone developed and sold its Atom processors for netbooks and MIDs. The company wants to maintain tight control over the types of products the derivative Atom chips will go inside, Maloney said. Intel will not be transferring Atom's manufacturing process technology to TSMC, so any chips that result from the deal will be manufactured by Intel.

"What we're doing here [is] picking the segments we go after," Maloney said.

The companies have collaborated for close to 20 years on products, including WiMax chips.

Intel officials shied away from answering questions about whether the TSMC deal would affect Atom's product road map or future smartphone chips like Moorestown. Details surrounding the deal are still being worked out, Intel officials said.

This agreement is similar to a strategy employed by Arm Holdings, which generates revenue by licensing smartphone and embedded chip designs to chip makers, said Jack Gold, principal analyst at J.Gold Associates. Arm has licensed its chip cores to companies such as Texas Instruments and Qualcomm, which provide chips for smartphones.

"This is a direct attack on competing processors, especially the Arm processor, which is trying to move upstream from phones and embedded gadgets, while Intel is trying to move downstream with Atom into this overlapping space. The battleground in the middle will be aggressive and potentially bloody, with huge potential returns," Gold wrote in a research note.

The partnership will help Intel add a revenue stream by licensing out its Atom core and adds "massive market potential" through TSMC's customers, Gold wrote. TSMC has connections to many consumer and lower-end products, like smartphones and embedded device markets, especially in Taiwan and Japan, Gold wrote.

The partnership is a win for both companies, said Rick Tsai, president and CEO of TSMC, during the call. It is mutually beneficial because it will allow both companies to generate additional revenue and reach new markets, especially at a time when the semiconductor industry is struggling.

"People in our industry must work together ... so we can share the benefits," Tsai said.

Intel has taken a number of steps to develop integrated chips that could fit into new products like set-top boxes and TVs. Intel in February said it was prioritizing its move from the 45-nanometer process to the new 32-nm process technology, which should help the company produce faster and more integrated chips.

To that end, the company said it would spend $7 billion over the next two years to revamp manufacturing plants. It will also help Intel make more chips at lower cost and add efficiencies to the production process. Intel will begin producing chips with 32-nm circuitry starting late this year.

HP Shuts Down Upline Online Storage Service

Hewlett-Packard Co. Monday said it has closed down its online backup service after less than a year of operation.

HP did not provide a reason for closing the service other than to say, "HP continually evaluates product lines and has decided to discontinue the HP Upline service on March 31, 2009."

Patricia Kinley, a spokeswoman for HP's Personal Systems Group, said the company stopped backing up files as of Feb. 26.

"HP will keep the file restore feature of the Upline service operational through March 31, 2009 ... in order for customers to download any files that have been backed up to Upline," she said in an e-mail response.

HP's Upline service had trouble from the start. Three weeks after opening in April last year, it went down for a week. Users at the time reported problems with the client software used to upload and synchronize files with the hosted service -- calling Upline a good idea that was horribly executed.

The Upline service was among a number of subscription-based online backup models that emerged over the past couple of years, including EMC Corp.'s Mozy, Nirvanix, Carbonite, Symantec Corp.'s upcoming SwapDrive, ClunkClick and Robobak, and Yahoo Briefcase, which also announced it will be shutting down this month.

Like many of the other service offerings, HP had acquired the technology for Upline. HP bought start-up Opelin Inc. in November 2007 for its Titanize cloud-based file backup service.

HP's Upline service charged between $4.99 and $8.99 per month for unlimited online storage for home, family and professional users.

Sony's PSP Gets a New Look

Sony's redesigned PSP is rumored to be getting a new look. But all changes to this portable gaming device, expected to be released later this year, are largely cosmetic.

The biggest change to the PSP's design is a sliding screen; as shown in a mock-up from VG247, it slides up to reveal various controls that are hidden beneath it when closed. The new PSP, dubbed the PSP 4000, may be "significantly smaller in width" because of the new design, Eurogamer says.

According to reports, the PSP 4000 will have to be in the open position to play full-featured games, but there's no word on whether the rumored design includes game controllers or a keyboard underneath the screen. The 4000 may also allow you to play basic games, like LocoRoco, using the shoulder buttons (the L and R buttons at the top of the device) when the screen is closed.

This latest rumor comes after last week's news that the PSP may let go of its UMD drive to offload more bulk from the game system. Instead of the disc drive, Sony may look to sell games through the online PlayStation Store or perhaps even on Sony Memory Sticks. If the rumors are true, then the 4000 is a significant step forward for the PSP; however, the new PSP will still be based on current PSP tech with no improved graphics or gaming features. That being said, with the PSP 3000 and these new rumors, it's refreshing to hear about PSP updates that go beyond new colors, various entertainment bundles and incremental firmware updates.

The rumored release date for the 4000 is late 2009; it may be followed by a PSP2 in 2011 or 2012.

IBM Looks to Secure Internet Banking With USB Stick

IBM's Zurich research laboratory has developed a USB stick that the company says can ensure safe banking transactions even if a PC is riddled with malware.

A prototype of the device, called ZTIC (Zone Trusted Information Channel), is on display for the first time at the CeBIT trade show this week. IBM hopes to entice banks into buying it for online banking, which saves banks money on personnel costs but is constantly under siege by hackers.

When plugged into a computer, ZTIC is configured to open a secure SSL (Secure Sockets Layer) connection with a bank's servers, said Michael Baentsch, product manager for BlueZ Business Computing at the Zurich lab.

ZTIC is also a smart-card reader and can accept a person's bank card for verification. Once a PIN (personal identification number) is verified, a transaction can be initiated through a Web browser.

Web browsers, however, are a point of weakness for online banking because of so-called man-in-the-middle attacks.

Hackers have created malicious software programs that can modify data as it is sent to a bank's Web server, but then display the information the consumer intended in the browser. As a result, a person's bank account could be emptied. Man-in-the-middle attacks are also effective even if the bank's customer is using a one-time password generator.

The ZTIC, however, bypasses the browser and goes directly to the bank. It ensures that the data exchanged is accurate.

For example, say a bank customer wants to transfer money. The customer will input US$100 into a form in the browser. The bank's servers will then try to confirm the amount. During a man-in-the-middle attack, the attacker can change the transfer to $1,000 but modify the confirmation message to still show $100.

Since it has a direct secure connection with the bank's servers, the ZTIC will show the amount that actually has been requested to be sent. So even if the browser shows a confirmation for $100, the ZTIC will show $1,000, indicating a man-in-the-middle attack in progress, Baentsch said. The user would know to reject the transaction and press the red "x" button on the ZTIC.
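
The essence of the ZTIC defense is comparing what the bank's server actually received, reported over the device's own trusted channel, against what the user intended. A minimal sketch of that comparison logic (the function and return values are illustrative, not IBM's API):

```python
def confirm_on_trusted_device(intended_amount, server_confirmed_amount):
    """Sketch of the ZTIC idea: the trusted device shows the amount the
    bank's server actually received over its own SSL channel. If browser
    malware altered the transaction, the two values disagree and the
    user rejects the transfer on the device itself."""
    if intended_amount == server_confirmed_amount:
        return "approve"  # amounts match: user confirms on the device
    return "reject"       # mismatch: man-in-the-middle suspected

# Browser shows $100, but the device reports the server was asked for $1,000
print(confirm_on_trusted_device(100, 1000))  # reject
print(confirm_on_trusted_device(100, 100))   # approve
```

The key design point is that the comparison happens on hardware the malware cannot reach, so the browser's (possibly forged) display is never trusted.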

"If malware is attacking your online banking transaction, it will show you something strange has happened," Baentsch said.

IBM expended a lot of effort to figure out how to initiate an SSL session within a USB stick, Baentsch said. It takes some processing muscle, and since the stick runs independently of the PC, it does not have access to the computer's processor.

ZTIC uses a chip from microprocessor designer ARM, and the software has been designed so it can quickly establish an SSL session, Baentsch said. Although it takes the form of a USB stick, no data can be stored on it, which also prevents malicious software from infecting it.

Using ZTIC would also prevent phishing attacks, where a fraudulent Web site tries to elicit sensitive details from a user, and pharming attacks, where DNS (Domain Name System) settings have been tampered with, Baentsch said. ZTIC checks to ensure that the Web site has a valid security certificate.

IBM has internal figures on how much the ZTIC might cost for banks, but Baentsch wouldn't reveal them, saying that it would depend on the final design specifications of the ZTIC and other factors.

Extend Laptop Battery Life With Hassle-Free PC

Three Quick Ways to Improve Laptop Battery Life

Like chocolate and episodes of Mad Men, there's no such thing as too much battery life. Alas, it's the rare notebook battery that'll give you more than a few hours--unless you know some tricks for squeezing extra juice. (And by the way, if you like these tips, be sure to check out "Tips for Laptop Users.")

Remember these three tips the next time you travel:

  1. Disable Bluetooth and Wi-Fi. Few airplanes offer Wi-Fi (yet), so turn off your notebook's power-sucking Wi-Fi radio. Same goes for Bluetooth.
  2. Drop the screen brightness. You can afford to keep screen brightness cranked up when your notebook is plugged into an outlet, but not when you're flying coach. Drop the brightness setting a few notches, then get back to work. Chances are you'll hardly notice the difference. Then drop it a few more notches. The lower, the better.
  3. Watch downloads, not DVDs. Notebooks are great for watching movies, but DVD drives consume a considerable amount of power. Leave the DVDs behind and choose digital downloads instead. Stock your hard drive with movies from Amazon or iTunes and you'll be able to watch longer. Don't want to pay for movies you already own? Use a tool like Handbrake to rip your DVDs, creating MPEG-4 files you can store on your hard drive (or put on your iPod).
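
The three tips above all work the same way: runtime is just battery energy divided by total power draw, so shaving a few watts stretches the estimate noticeably. A rough illustration, where the battery capacity and all the wattages are hypothetical guesses rather than measured figures:

```python
def runtime_hours(battery_wh, base_draw_w, extras_w=0.0):
    """Rough laptop runtime estimate: battery energy (watt-hours)
    divided by total power draw (watts)."""
    return battery_wh / (base_draw_w + extras_w)

battery = 56.0  # Wh, a typical six-cell pack
# Baseline 14 W, plus ~1 W for Wi-Fi/Bluetooth radios and ~3 W of
# extra screen brightness; turning both down reclaims the difference.
print(f"everything on: {runtime_hours(battery, 14, 1 + 3):.1f} h")   # 3.1 h
print(f"radios off, dimmed: {runtime_hours(battery, 14):.1f} h")     # 4.0 h
```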

Give Your Laptop Battery a Longer Lease on Life

Does your laptop spend more time on your desk than your lap? If so, you're probably causing your battery to wear out much sooner than it needs to.

See, it's a sad (and expensive) fact of life: You're lucky to get 18-24 months from a battery before it loses a good chunk of its charge capacity (meaning it no longer powers your laptop for as long as it used to).

And you're accelerating this unfortunate timeframe if you leave your laptop plugged in 24/7, which is common for most folks who work at a desk. Because the battery rarely (if ever) gets a chance to discharge, it loses its capacity to hold a charge.

The simple solution: Pull the battery out of the laptop and leave it out when you're deskbound. Most laptops can run on straight AC power, so there's no need for the battery. And it's easy enough to pop back in when you hit the road (though obviously you'll want to make sure it's charged, so plan ahead a bit).

It's a hassle, sure, but consider the price of a replacement battery: usually $100 or more. What's more, old, discarded batteries wreak havoc on landfills. Sooner or later, they'll leak acid into the ground. So it's in your best interests to keep your battery as long as possible, and to keep it from dying a premature death.

Turn Vista's Sleep Button Into a Power Button

As a recent Windows Vista convert (I just couldn't cling to XP any longer--a subject for another day), I'm mostly liking the OS. But I do have one small grievance: When I click the Start button and then click what looks like a power button, my system doesn't actually shut down. Instead, it goes to sleep.

Hey, Microsoft: I don't want it to go to sleep. I want it to shut down! But that requires an annoying extra step: I have to mouse over to another menu and choose Shut Down from a list of half a dozen options. If I'm in a hurry, it's way too easy to inadvertently click the wrong option.

Fortunately, there's a way to reprogram that "sleep" button to become an actual power button. Here's the process in a nutshell:

  1. Open the Control Panel and go to Power Options.
  2. Click Change plan settings for your selected power plan.
  3. Click Change advanced power settings.
  4. Expand Power buttons and lid.
  5. Expand Start menu power button.
  6. Change the setting from Sleep to Shut down.
  7. Click OK.

Wow, could Microsoft have buried that setting any deeper? Thankfully, Windows 7 makes it much easier to reprogram this button's function. Now if we could just get an honest-to-goodness Shut Down button that doesn't require a visit to the Start menu, we'd really be making progress.

Rick Broida writes PC World's Hassle-Free PC blog. Sign up to have Rick's newsletter e-mailed to you each week.

Apple Launches New iMacs, MacMinis, Mac Pros

As had been rumored, Apple this morning upgraded its line of iMacs, MacMinis, and Mac Pros, with improved processor speeds and upgraded storage space. All of the new computers are now available in the Apple store.

The new iMacs, available in both 20-inch and 24-inch models, feature double the amount of RAM found in previous models: 2GB for the 20-inch version and 4GB for the 24-inch. Hard drive space also gets doubled, to 320GB for the 20-inch model, and 640GB or 1TB for the 24-inch model. Prices range from $1,199 for the base 20-inch 2.66-GHz model up to $2,199 for the 24-inch 3.06-GHz model with 1TB of storage.

Mac Minis received a speed bump as well with 2.0-GHz processors, more disk space, and NVIDIA GeForce 9400 graphics cards. Two new models are available: For $599 you can get a 2.0-GHz Mac Mini with 1GB of RAM and a 120GB hard drive; a version with the same processor speed, 2GB of RAM, and a 320GB hard drive goes for $799.

Two Mac Pro models are available as well. A $2,499 model features a quad-core 2.66-GHz Intel Xeon (Nehalem) processor, 3GB of RAM, and a 640GB hard drive, while a $3,299 model brings you two 2.66-GHz Intel Xeon (Nehalem) processors (eight cores in all), 6GB of RAM, and a 640GB hard drive.

Also, both of the company's new wireless devices, Time Capsule and AirPort Extreme, now offer simultaneous dual-band Wi-Fi on both the 2.4-GHz and 5-GHz bands, allowing all the devices on the network to use the most efficient band automatically. This will allow them to serve wireless Internet connections to both Macs and PCs, alongside Wi-Fi devices such as iPhone, iPod touch, and Apple TV. Time Capsule costs $299 for the 500GB model and $499 for the 1TB model. AirPort Extreme comes at $179.

Friday, February 20, 2009

Intel goes to court in licensing spat with Nvidia

Intel Corp. went to court this week to resolve a licensing dispute with Nvidia Corp. over the latter's plan to build chip sets compatible with Intel's latest Nehalem processors.

In a filing in the Delaware Court of Chancery, the chip maker on Tuesday asked a judge to rule that Nvidia is not licensed to produce chip sets compatible with any Intel processor that has integrated memory-controller functionality, such as Intel's Nehalem microprocessors.

Intel launched its first Nehalem chips in November, when it introduced the Core i7 chips. The new chips integrate memory controllers inside the chip, which helps the CPU communicate with the memory faster. Future Intel laptop and desktop processors are also expected to include integrated memory controllers.

Intel had discussions with Nvidia for more than a year attempting to resolve the matter, but the talks were unsuccessful, said Intel spokesman Chuck Mulloy. Intel had to go to court to resolve this dispute, Mulloy said.

In response to the court filing, Nvidia on Wednesday said that a four-year-old bus license with Intel allowed it to build chip sets based on Intel CPUs with integrated memory controllers.

"We are confident that our license, as negotiated, applies," said Jen-Hsun Huang, Nvidia president and CEO, in a statement.

The license revolves around usage of a bus, or point-to-point interconnect, that helps the CPU communicate with components in a PC. Nvidia is "aggressively developing" new products for Intel's current interconnect and Intel's future DMI (direct media interface) bus.

Nvidia makes chip sets -- a set of integrated circuits -- for Intel and for Advanced Micro Devices Inc.'s CPUs to help processors communicate with components like network and storage controllers.

As CPUs integrate more capabilities like graphics, Intel may be trying to gain more control over its future chip technology, said Nathan Brookwood, an analyst at Insight64. Intel plans to integrate graphics in a two-chip package it is expected to start shipping later this year.

"Intel [could] be saying 'We gave you some technology to go into old processors ... now we're not going to let you do that anymore,'" he said.

But Nvidia CEO Huang said Intel's CPU business is decaying, and the court filing is an attempt to save it.

"At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the [graphics processing unit]. This is clearly an attempt to stifle innovation to protect a decaying CPU business," Huang said.

Nvidia is trying to push GPUs as an alternative to CPUs, because GPUs execute advanced tasks like video encoding and decoding much quicker. It is also pushing the CUDA parallel programming architecture, a software tool kit that allows programmers to take advantage of the processing speeds of GPUs.

Both CPUs and GPUs are important and neither is going away soon, Brookwood said. For now, though, the CPU remains the lead processor in laptops and desktops, with the GPU playing a subordinate role.

"Nvidia basically for the last year has been arguing that all the hard work is in the GPU and nobody's going to care about the CPU. Intel's been going in the opposite direction," Brookwood said.

As better graphics capabilities are integrated into CPUs, fewer buyers will pay extra for a separate graphics card. That strikes at the heart of Nvidia, which is best known for its graphics cards. The general computing trend is not on Nvidia's side, and the company already faces the problem of how to grow its discrete-graphics business during the recession, Brookwood said.


Analyst: HP financial results could mean bad news for Microsoft

Hewlett-Packard Co.'s financial results could mean more trouble for Microsoft Corp. as the PC market continues to struggle, a financial analyst said Thursday.

HP's "weak systems results" and fiscal outlook suggest the "PC market remains challenging and unlikely to improve over near-term," said a research note by Barclays Capital analyst Israel Hernandez.

HP on Wednesday posted declines in all of its businesses except its EDS services business for its fiscal first quarter, ended Jan. 31. The company, like most others in the technology and other business sectors, is being affected by the global economic recession.

"HP's results and commentary suggest that PC, as well as server demand, remains subdued, a condition that is likely to persist given slowing global macro conditions that are rapidly affecting both consumer and enterprise sales amid mounting corporate layoffs," Hernandez wrote.

As a result, analysts will likely lower consensus estimates for Microsoft's fiscal third quarter, which the company is in now, he wrote.

It's HP's PC results that will affect Microsoft the most, especially if the market continues to decline.

Microsoft blamed lagging PC sales for missing estimates for its fiscal second quarter, the results of which it announced on Jan. 22. The company also said it would lay off up to 5,000 workers, an unprecedented move for Microsoft.

Microsoft did not warn Wall Street or investors that it would miss expectations before announcing its second-quarter results, as some had expected it to. However, it did announce the results and layoffs before U.S. markets opened that day, which it typically does not do.

Microsoft had expected the PC market to grow between 10% and 12% in the first quarter, but the market was flat. The company also blamed the increased interest in netbooks, or low-cost mini-notebooks, for its financial shortfall.

Barclays' Hernandez noted that netbooks are the only category of PCs showing growth, but that this is detrimental to Microsoft's client operating margin and its Windows average selling prices. Selling Windows XP on a netbook means less money for Microsoft than selling Windows Vista on a full-featured PC, he wrote.

Indeed, Microsoft's problem with netbooks lies with Vista, which has too big a hardware footprint to run reliably or well on netbooks. XP runs fine, so XP and Linux are what original equipment manufacturers are preinstalling on most netbooks.

Microsoft hopes to alleviate the Vista problem and take more advantage of the netbook market in the future with Windows 7, the next version of its client operating system. Microsoft has said all versions of Windows 7 will run well on netbooks. Windows 7 is expected to be available before the end of the year or, at the latest, the beginning of 2010.

Microsoft is scheduled next week to give analysts an update on its current quarter on a conference call, which should give a better sense of how its quarter is shaping up.

Take Windows 7 for a spin with VirtualBox

Everyone likes to try new and shiny technology toys like the Windows 7 beta, but when the price is having to replace your existing operating system, that's too much for most people. That's when being able to use a virtualization program can come in darn handy.

To test out how well Windows 7 works on a virtualized system, I decided to use Sun's VirtualBox software. While there are, of course, other virtualization programs out there, such as VMware's Workstation and Parallels Desktop, VirtualBox has two significant advantages over the others. First, it's free. Second, you can use it with several operating systems, including Windows, Linux, Macintosh and OpenSolaris.

In my case, I decided to use VirtualBox to run Windows 7 on two Dell Inspiron 530S systems, one running Windows XP Pro SP3 and the other running MEPIS 7 Linux. Each PC came with a 2.2-GHz Intel Pentium E2200 dual-core processor with an 800-MHz front-side bus, 4GB of RAM, a 500GB SATA drive and integrated Intel Graphics Media Accelerator 3100 graphics. While not powerful systems, they proved to have more than enough CPU power to run both their native operating systems and Windows 7.

Running VirtualBox

VirtualBox comes in two editions. The full VirtualBox is free for personal use and evaluation but doesn't come with the complete source code; it includes several extra features, such as an RDP (Remote Display Protocol) server and USB support. (Other virtualization applications, like Xen, require tweaking before they'll support USB.) VirtualBox OSE (Open Source Edition), also free, does come with the source code but omits those extras. Both versions can run Windows 7.

In general, you'll need at least 1GB of RAM to run VirtualBox and a guest operating system. More RAM is always better. In my testing, I found that Windows 7 would actually run on as little as 512MB, while Vista really needs at least 1GB of its own.

VirtualBox should run on any recent CPU, but it does best with high-end processors that support hardware virtualization enhancements such as Intel's VT-x and Advanced Micro Devices' AMD-V.

The first step is to download a copy of VirtualBox. To run Windows 7 successfully, you'll need at least VirtualBox 2.1.0 -- I ran it on the latest version, VirtualBox 2.1.2.

[Screenshot: VirtualBox lets you run Windows 7 on a Linux system.]

If you're a Linux or OpenSolaris user, you can also obtain a copy using your software package manager program. VirtualBox supports openSUSE, Fedora, Ubuntu, Debian, Mandriva, PCLinuxOS, RHEL (Red Hat Enterprise Linux), SLE (SUSE Linux Enterprise) and Xandros. You can also find additional support, both for specific operating systems and in general, in the FAQ file and in the User Manual (PDF).

On Windows and Mac OS X, installation requires little more than clicking on the installation file and letting it run. It's a bit more complicated on Linux and OpenSolaris: on OpenSolaris you need to compile the program, and on Linux you'll need to follow some additional steps, which are described in the Linux download section.

Finally, if you need more guidance, you can find step-by-step instructions for VirtualBox 2.1.0 at the Two Guys Tech site.

Setting up the VM

Your next step is to set up a new virtual machine for Windows 7. You do this by clicking the New button, which will then ask you how big a hard drive you want for the operating system. The default is to give it a 20GB virtual hard drive. With Windows 7, I decided to give it a more generous 40GB. You can also let VirtualBox dynamically determine how much hard drive room an operating system can have, but I prefer to decide for myself.

This done, you set up how much RAM and video memory Windows 7 can have. I prefer to give the operating system an ample 1GB of RAM and 128MB of video memory. You can get by with less, but you'll start noticing system delays.

VirtualBox also lets you set up 3-D graphics acceleration and access optical discs, USB devices, shared drives and so on through its main interface. You can set this up after you have Windows 7 installed, but I prefer to get this basic configuration out of the way first.
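If you'd rather script this setup than click through the GUI, VirtualBox also ships a VBoxManage command-line tool. The Python sketch below is purely illustrative: it builds (but does not run) invocations matching the settings above, a 40GB disk, 1GB of RAM and 128MB of video memory. The VM name is hypothetical, and exact flag spellings vary between VirtualBox versions, so check `VBoxManage --help` on your own install.

```python
# Build the VBoxManage commands corresponding to the GUI steps above.
# "Win7" is a hypothetical VM name; pass each list to subprocess.run() to execute.
vm = "Win7"

commands = [
    ["VBoxManage", "createvm", "--name", vm, "--register"],
    ["VBoxManage", "createhd", "--filename", vm + ".vdi", "--size", "40960"],  # 40GB virtual disk
    ["VBoxManage", "modifyvm", vm, "--memory", "1024", "--vram", "128"],       # 1GB RAM, 128MB video
]

for cmd in commands:
    print(" ".join(cmd))
```
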

Laptop face-recognition tech easy to hack, warns Black Hat researcher

WASHINGTON -- The face-recognition technologies offered by some laptop vendors as a way for users to securely log onto their systems are deeply flawed and can be relatively easily bypassed, a security researcher warned today at the Black Hat security conference here.

Nguyen Minh Duc, a researcher at Bach Khoa Internetwork Security Centre, a Hanoi-based security organization, showed how attackers could break into laptops from Lenovo, Toshiba and Asus featuring face-recognition technologies, simply by using digitized images of the actual user of the systems in each case. The attacks were conducted on a Lenovo system with its Veriface III technology, an Asus system featuring its Smart Logon software and a laptop using Toshiba's Face Recognition technology.

The attacks are possible because the underlying technology the vendors use for face authentication can be easily fooled -- meaning it cannot be trusted for secure log-on purposes, Minh Duc said. He said he has notified each of the vendors of the issue and urged them to stop offering face recognition as a secure log-in option until the problem is fixed.

Toshiba, Lenovo and Asus are among a handful of vendors currently supporting face authentication as a secure log-in option. The idea is to let a user's face serve as a password for gaining access to a system. Instead of logging in with a username and password, users simply sit in front of a built-in camera on the system that captures an image of their face and compares selected features from the image with those previously registered by the user. Users are granted access only if the images match.

Laptop vendors have touted the technology as safer and easier than relying on usernames and passwords.

The problem, according to Minh Duc, is that face-recognition algorithms cannot tell the difference between a digitized image and a real face. Because the algorithms, in effect, process digital information sent via the camera, it is possible to trick the software with an image of a registered user of a system, he said.

An attacker could obtain a photo of the user and tweak the lighting and viewpoint with commonly available image-editing tools, he said. Because a hacker is unlikely to know what the face stored in the system looks like, he might have to create a large number of digital facial images -- each with different lighting and viewpoints -- to fool the face-recognition technology. An attacker would need to have a reasonable amount of experience with image editing and regeneration to successfully carry out such attacks, Minh Duc added.
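To see why a photo can pass, it helps to sketch the matching step in the abstract. The toy Python below is my own illustration, not any vendor's algorithm: it reduces each face image to a small feature vector and grants access when the vector is close enough to the enrolled one. A good photo of the user yields a vector essentially as close as the live face, so it sails through.

```python
def match(enrolled, candidate, threshold=0.1):
    # Naive face match: Euclidean distance between feature vectors vs. a threshold.
    dist = sum((a - b) ** 2 for a, b in zip(enrolled, candidate)) ** 0.5
    return dist < threshold

enrolled = [0.31, 0.58, 0.12, 0.77]   # features registered by the real user
live     = [0.30, 0.59, 0.12, 0.76]   # the same user sitting at the camera
photo    = [0.32, 0.57, 0.13, 0.77]   # a printed photo of the user held up to the camera

print(match(enrolled, live))   # True
print(match(enrolled, photo))  # True -- the algorithm can't tell a photo from a face
```

Real systems extract far richer features than four numbers, but the structural weakness is the same: the algorithm only ever sees pixels from the camera.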

At Black Hat, Minh Duc showed how to access laptops from each of the three vendors simply by placing digitized images of actual users in front of the built-in laptop cameras. The approach worked even when the face-recognition software was set to its highest security setting. With the Toshiba face-recognition technology, Minh Duc had to move the images a bit to fool the technology because it looks for facial movement. It is also possible to use black-and-white images to fool one of the systems, he added.

What makes the vulnerability in laptop face-recognition technology particularly dangerous is that compromises are harder to spot, Minh Duc said. An attacker could gain access to a system without the real user ever knowing about it, he claimed.

Intel Eyes Cloud Computing With New Hardware, Software

Intel is making a push into cloud computing with forthcoming changes in its Nehalem server line aimed at large data-center deployments.

As part of that initiative, the company earlier this week outlined hardware and software updates that it said will lead to energy savings and offer the scalability necessary for cloud-computing services.

Intel hopes to provide technology for low-range and midrange servers that can share workloads effectively if demand for a cloud application spikes, said Jason Waxman, general manager of high-density computing at Intel. Server deployments would depend on the resources needed by each cloud, with some requiring faster network connections or more memory. For example, the hardware needs of a multimedia-intensive service like Google Earth would differ from those of an e-mail service like Gmail, Waxman said.

In addition to providing servers that deliver efficient cloud services, Intel wants the servers to be power-efficient. Waxman said that power consumption and cooling account for up to 23 percent of server-deployment costs, so the company is building motherboards that could help cool systems efficiently while reducing energy costs.

Intel is developing a new motherboard, designed for servers used in cloud computing, that cuts idle power draw to 85 watts, compared with 115 watts for standard Nehalem-based boards. A reduction of 30 watts per server could save up to US$8 million over three years in a deployment of 50,000 servers, Intel said.
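Intel's $8 million figure is easy to sanity-check. The back-of-envelope Python below reproduces it under two assumptions that are mine, not Intel's: electricity at $0.10 per kWh, and a PUE of 2.0 (each watt saved at the server also saves roughly a watt of cooling).

```python
watts_saved = 30          # per-server idle reduction (115W down to 85W)
servers = 50_000
hours = 3 * 365 * 24      # three years of continuous operation
price_kwh = 0.10          # assumed electricity price, $/kWh
pue = 2.0                 # assumed power-usage effectiveness (cooling overhead)

kwh = watts_saved * servers * hours / 1000   # kilowatt-hours saved at the servers
savings = kwh * price_kwh * pue              # dollars, including cooling
print(round(savings))  # -> 7884000, roughly the $8 million Intel cites
```
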

The upcoming Nehalem-based boards will use Xeon processors due for release later this quarter. Intel will provide the motherboards through partners like Dell, Hewlett-Packard and IBM.

Some of the redesigned motherboards remove expansion slots to discourage the use of power-hungry components and peripherals like graphics cards and hard drives; users can instead access centralized storage over a network. Intel is also grouping hot components together so that less energy is needed to cool a system.

"We've actually worked with certain cloud service providers to ... change the fundamental settings to come up with something in the silicon -- whether it's the chipset or CPU -- to meet a particular optimized need," Waxman said.

The motherboards will include voltage regulators and work with software tools to monitor power consumption. One such software tool, called Dynamic Power Node Manager, will cap and balance power consumption between servers to cut energy costs. Intel tested Node Manager with Chinese search engine Baidu, which saved 40 watts per server during a cloud implementation.

Intel is also providing software tools like compilers and debuggers to improve performance and analyze code. Optimizing code helps tasks execute more quickly and efficiently while using fewer system resources; that could save up to $20 million over three years in a 50,000-server deployment, Waxman said.

The company has worked on optimizing search code for most of the major search providers, Waxman said.

"We actually have people ... on site with these large cloud service providers doing hands-on tuning -- looking at their workloads ... to get more performance out of it," Waxman said.

One thing Intel can't control is the data-throughput bottleneck caused by slow network connections. Intel hopes to ease that with the VMDQ (Virtual Machine Device Queues) feature, which speeds up data throughput across virtual machines by intelligently queueing server traffic. Hypervisors on servers with the queueing software work together to split traffic -- like storage and Web traffic -- and balance it over multiple virtual machines. The feature cuts the bottlenecks that typically affect a 1Gbps network, Waxman said.

"In the past one virtual machine could hog up all the traffic. What you really want to be able to do is put things in a queue," Waxman said.

Taking advantage of virtualization technology, Intel also hopes to standardize the deployment of the DCMI (Data Center Management Interface) protocol across virtualized hardware and software environments to ease data-center task management. The specifications include features to measure power consumption to effectively share resources across a large-scale server deployment. For example, DCMI can cap power consumption on servers and monitor temperature to prevent servers from overheating.

About 14 percent of servers purchased today go into cloud deployments, Waxman said. That number will rise to 25 percent by 2012, with more cloud deployments going into large data centers of 50,000 servers or more, he said.

Company Demos Turn-by-Turn iPhone GPS App

A friend of mine has been longing for a smartphone for years, but he's continued to hold off on the hopes that when he does eventually get one, it will do absolutely everything from making phone calls to preparing grilled cheese sandwiches. Phones these days have gotten pretty close to his ideal, but they still lack one key ingredient for him (besides the sandwich-making), and that's GPS-based turn-by-turn directions.

The iPhone's built-in Google Maps are serviceable for many purposes, but if you've ever used an in-car GPS system with automatic rerouting and a snide British voice telling you when you've taken a wrong turn, you'll know that it falls short in many places. Last July, Apple VP Greg Joswiak said that he expected that hole in the iPhone's functionality to be filled eventually, but that there were technical obstacles.

The obstacles appear not to be just technical either, as some reports indicate that Apple's SDK terms prohibit such features on the iPhone, though some of this may possibly be because of Google's own terms of use for Google Maps, which restrict using the information for "real time navigation or route guidance, including but not limited to turn-by-turn route guidance."

This doesn't appear to have stopped some developers. At this year's Mobile World Congress held in Barcelona, Sygic showed off a prototype turn-by-turn navigation app for the iPhone, including voice prompts and automatic route recalculation, though it only contained maps for Europe. Sygic--which has developed similar applications for a number of other platforms, including many cell phones--says that it plans on submitting the program to the App Store, just to see how the cookie crumbles.

Sygic's not the only company interested in the GPS market, either. A company called XRoad has had two GPS applications on the iTunes Store for a couple of months now; however, the apps don't feature voice prompts or live turn-by-turn directions, and user reviews have been decidedly mixed. Last June, GPS unit maker TomTom also said that it had an iPhone app ready to go, but that software has yet to materialize.

It would seem likely, per Joswiak's comments if nothing else, that some sort of official turn-by-turn direction system is coming to the iPhone eventually, whether it be via Apple or a partnering company.