
A Near-Sourcing trend in 2010 for the U.S.?

January 7, 2010

In December 2009, CIO.com published its predictions for 10 outsourcing trends in 2010. Chief among the “good news”: the market for outsourcing is expected to improve in 2010 along with the general economic recovery.

Trends expected for 2010, however, may alter the landscape for both customers and providers. Among the most interesting of the cited trends is “Offshoring to America”: on-shore U.S. locations, particularly in economically distressed areas, are now beginning to look like relative bargains compared to other global markets. This is likely to be particularly true for public sector outsourcing, where U.S.-based solutions tend to get preferential treatment.

If the U.S. continues its “Weak Dollar” strategy in 2010, I believe that we will see an increase in the number of “Near-Sourcing” companies created in areas of the U.S. with a lower cost of living and lower income taxes, in states like Arkansas, Louisiana, Tennessee and Nevada.

Outsourcing to India, China, South America and Eastern Europe will still thrive, but some of the risks associated with outsourcing to those regions might be mitigated by some of the newer Near-Sourcing alternatives.

Intel’s 48-core processor to compete with the human brain

December 10, 2009

Pushing several steps farther in the multicore direction, Intel demonstrated a fully programmable 48-core processor on December 2. The company believes it will pave the way for computers powerful enough to do more of what humans can.

The 1.3-billion-transistor processor, called the Single-chip Cloud Computer (SCC), is the successor to the 80-core “Polaris” processor that Intel’s Tera-scale research project produced in 2007. Unlike that precursor, though, the second-generation model can run the standard software of Intel’s x86 chips, such as its Pentium and Core models.

The cores themselves aren’t terribly powerful–more like lower-end Atom processors than Intel’s flagship Nehalem models, according to Intel Chief Technology Officer Justin Rattner at a recent press event. He went on to say that “collectively they pack a lot of power” and that Intel has ambitious goals in mind for the overall project. “The machine will be capable of understanding the world around them much as humans do,” Rattner said. “They will see and hear and probably speak and do a number of other things that resemble human-like capabilities, and will demand as a result very (powerful) computing capability.”

Intel is working with companies facing large-scale computing challenges that today require thousands of networked servers–a here-and-now problem compared to the more sci-fi challenge of computer vision. The chipmaker has found only one flaw with the chip so far and has booted Windows and Linux on SCC systems. The company has also demonstrated computers using the processor running Microsoft’s Visual Studio on Windows, among other tasks.

Going Green in the Data Center

November 9, 2009

In the last several blogs I wrote about the prevalence of outsourcing as a way to reduce cost within the enterprise. Here I would like to take a brief look at another method of cost reduction that is gaining momentum: the “Green Data Center,” achieved through virtualization and cloud computing.

IT is in the middle of a fundamental transition from rigid traditional data centers toward a more responsive model where needs are met faster and more efficiently. Over the past several years, many IT departments have committed to virtualization as a solution to the spiraling energy costs and inflexibility plaguing corporate data centers. By running applications on virtual servers and consolidating underutilized hardware, data centers can get maximum value from their equipment and reduce the energy needed to run and cool the facility.
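The consolidation math above is simple enough to sketch. This back-of-envelope calculation uses illustrative numbers, not measured data, to show how retiring underutilized physical servers translates into power savings:

```python
def consolidation_savings_kw(servers, watts_per_server, consolidation_ratio):
    """Power saved, in kW, when `servers` physical machines are
    consolidated onto virtualized hosts at `consolidation_ratio`
    guests per host. Assumes hosts draw the same power as the
    servers they replace (a simplification)."""
    hosts_needed = -(-servers // consolidation_ratio)  # ceiling division
    before_kw = servers * watts_per_server / 1000
    after_kw = hosts_needed * watts_per_server / 1000
    return before_kw - after_kw

# e.g. 100 lightly loaded 400 W servers consolidated 10:1
print(consolidation_savings_kw(100, 400, 10))  # → 36.0
```

And since every watt of IT load also carries cooling overhead, the real facility savings are typically larger than the raw figure.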

While virtualization helps companies reduce energy costs and improve agility, there is another step that can be taken (with care): introducing cloud computing infrastructure solutions into the environment. Cloud computing is a form of computing in which applications, information and resources are managed in a virtual environment. The word “cloud” is a metaphor for the internet, reflecting the abstract, distributed nature of the underlying infrastructure. Cloud computing involves virtually hosted environments that allow users to connect to services hosted over the internet.

With that said, cloud computing is not for everyone, but it does pose an interesting path to going Green within the data center.

Rather than increasing the number of servers and storage in the data center, even within a virtualized environment, a cloud computing model will allow companies to get out of the computing infrastructure business–where appropriate–retaining only the portion that is essential to the enterprise. As the cloud environment becomes more mature and secure, purchase decisions will be framed by asking: Should we really be doing this ourselves, or can someone else do it better and at lower cost? Essentially, it is another type of outsourcing.
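The “do it ourselves or let someone else” question can be framed as a simple break-even comparison. All figures in this sketch are made-up illustrations, not vendor pricing:

```python
def cheaper_to_outsource(in_house_capex, in_house_monthly_opex,
                         cloud_monthly_fee, horizon_months):
    """True if the hosted (cloud) option costs less over the horizon.
    Ignores financing, depreciation and migration costs for simplicity."""
    in_house_total = in_house_capex + in_house_monthly_opex * horizon_months
    cloud_total = cloud_monthly_fee * horizon_months
    return cloud_total < in_house_total

# e.g. $50,000 of hardware plus $1,000/month to run it, versus a
# $2,200/month hosted service, evaluated over three years
print(cheaper_to_outsource(50_000, 1_000, 2_200, 36))  # → True
```

Note how sensitive the answer is to the horizon: with no up-front hardware cost to amortize, the same monthly fees would favor staying in-house, which is why the decision has to be made workload by workload.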

In the end, the best way to think about cloud computing is probably as yet another application deployment architecture. If the challenges it faces today, such as security, can be overcome, then CIOs will have another tool to reduce overall IT costs and contribute to the “Green Data Center” concept.