
Marlabs Blog

The Future is Analytics Monday, January 11, 2010

The increasing digitization of our day-to-day life has meant an exponential surge in the volumes of data being generated. The Internet apart, transformational infrastructure projects such as smart grids that lower energy consumption, sensors that help reduce traffic congestion, and electronic medical records for personalized healthcare are generating enormous amounts of data that require interpretation.

In 2007, there were 281 exabytes of information in existence, and by 2011 we are expected to have over 1,800 exabytes available. There are currently 107 million active Internet domains and 1.5 billion Internet users (roughly 25% of the world's population). Over 133,000 new sites are added to the Web every day.

Not surprisingly, analytics has emerged as a strategic imperative, as companies seek a competitive advantage in today’s digitized economy. In a recent study, 83% of CIOs polled identified business intelligence and analytics as their top priority to enhance their organizations’ competitiveness.

Recognizing its growing criticality, IBM is betting big on analytics. In April 2009, it launched a new services unit with 4,000 consultants devoted to the field. It has since opened seven analytics centers around the world, including ones in New York, London and Beijing, and recently acquired data analytics company SPSS for $1.2 billion and business analytics firm RedPill.

Taking a step further, IBM has unveiled a new internal analytics product that the company is touting as the “largest private cloud computing environment for business analytics in the world.” Called Blue Insight, it will provide the over 200,000 employees in IBM’s sales and development department with the ability to extract data and information to make decisions and gain further insight at the point of sale. Blue Insight will gather information from nearly 100 different information warehouses and data stores, providing analytics on more than a petabyte (1,000 terabytes or 1,000,000 gigabytes) of data.

Posted by Srinivasan Balram | No Comments
VMware Fusion on Mac Delivers a Superior Windows 2008 Experience Wednesday, January 6, 2010

While on the topic of Mac vs. PC, a friend remarked that the Mac is one of the best platforms on which to run Windows Server 2008! That got my curiosity going, and I decided to put it to the test. The results had me smiling: Windows Server 2008 did indeed perform better on the Mac through VMware Fusion, and while consuming fewer resources:

Mac – dual-core CPU + 2 GB memory

PC – quad-core CPU + 8 GB memory

Haven’t turned on my PC in more than a week!

[Screenshots: Windows Server 2008 running under VMware Fusion on the Mac]

Posted by Srinivasan Balram | No Comments
SPICE / PC-over-IP (PCoIP) / Independent Computing Architecture (ICA) — Back to the Future! Tuesday, January 5, 2010

In an interesting development, Red Hat has open sourced its Simple Protocol for Independent Computing Environment (SPICE) virtual desktop protocol. SPICE was originally developed by Qumranet, the company Red Hat acquired in 2008.

SPICE, PC-over-IP (PCoIP) and Independent Computing Architecture (ICA) are all essentially advances on, or alternatives to, RDP (Remote Desktop Protocol). The disadvantage of RDP was that it always needed a client app, and therefore a client OS. This has changed with the next wave of products (VMware/Teradici, Microsoft/Calista and Red Hat/SPICE), which require neither a client OS nor a special app, just a monitor with a special card. So effectively the client has become a silent terminal, with the OS hosted on the server.

SPICE enables virtual machines (VMs) to act as host OSes for remote desktops. This truly actualizes the world of “cloud computing,” i.e. a single server hosting a number of VMs with remote desktop capability. Here’s the architecture:

The following link takes us back 20 years, to the beginning of the client/server era: IBM has just announced mainframes that can run 64 instances of Linux.

So imagine using a monitor and being able to switch between different VMs on different backends (a backend could be a mainframe, a cluster, or just a basic server).

Posted by Srinivasan Balram | No Comments
Cloud Computing in Silicon Monday, January 4, 2010

Intel has created an experimental “Single-chip Cloud Computer,” (SCC) a research microprocessor containing the largest number of Intel Architecture cores ever integrated on a silicon CPU chip — as many as 48 cores. It incorporates technologies intended to scale multi-core processors to 100 cores and beyond, such as an on-chip network, advanced power management technologies, and support for “message-passing.”

Architecturally, the chip resembles a cloud of computers integrated into silicon. The novel many-core architecture includes innovations for energy-efficient scalability, including improved core-to-core communication and techniques that enable software to dynamically configure voltage and frequency, taking power consumption from 125 W down to as low as 25 W.
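The SCC's "message-passing" support means cores exchange small messages through buffers rather than sharing cached memory. A minimal software sketch of that style, using Java threads standing in for cores and a bounded queue standing in for an on-chip message buffer (the class name, mailbox size, and message type are illustrative assumptions, not part of Intel's design):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Two "cores" communicate only by passing messages through a
// bounded mailbox; neither thread touches the other's state.
public class MessagePassingDemo {
    public static void main(String[] args) throws InterruptedException {
        // Bounded "mailbox", loosely analogous to an on-chip message buffer.
        BlockingQueue<Integer> mailbox = new ArrayBlockingQueue<>(16);

        Thread core0 = new Thread(() -> {       // sender "core"
            for (int i = 1; i <= 5; i++) {
                try { mailbox.put(i); } catch (InterruptedException e) { return; }
            }
        });

        Thread core1 = new Thread(() -> {       // receiver "core"
            int sum = 0;
            try {
                for (int i = 0; i < 5; i++) sum += mailbox.take();
            } catch (InterruptedException e) { return; }
            System.out.println("core1 received sum = " + sum);
        });

        core0.start(); core1.start();
        core0.join(); core1.join();
    }
}
```

Because all communication goes through the mailbox, the two threads never need a shared cache-coherent view of each other's data, which is exactly the property that lets the message-passing style scale to many cores.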

Posted by Srinivasan Balram | No Comments
Build your Own Java Profiler Saturday, January 2, 2010

Faced with troubleshooting a performance issue, developers are most likely to rely on a combination of loggers, debuggers and various open-source profilers. But what if you could develop your own profiler, almost effortlessly?

I came across this article, which explains how easily and quickly you can write your own profiler. Profilers tend to be rather expensive to run, although most have features (such as selective instrumentation and sampling) designed to minimize their run-time impact. But they provide general functionality; if you learn how they are built, you can develop a compact profiler narrowly targeted to do exactly what you need.
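As a taste of how small such a tool can be, here is a minimal sampling profiler sketch of my own (not the one from the article): a daemon thread periodically grabs another thread's stack trace and counts the topmost frame. All class and method names below are illustrative.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sampling profiler: periodically samples a target thread's
// stack and tallies which method was on top of the stack each time.
public class MiniProfiler extends Thread {
    private final Thread target;
    private final long intervalMs;
    private final Map<String, Integer> hits = new ConcurrentHashMap<>();
    private volatile boolean running = true;

    public MiniProfiler(Thread target, long intervalMs) {
        this.target = target;
        this.intervalMs = intervalMs;
        setDaemon(true); // never keep the JVM alive just to profile
    }

    @Override
    public void run() {
        while (running && target.isAlive()) {
            StackTraceElement[] stack = target.getStackTrace();
            if (stack.length > 0) {
                String top = stack[0].getClassName() + "." + stack[0].getMethodName();
                hits.merge(top, 1, Integer::sum); // count one sample for the top frame
            }
            try { Thread.sleep(intervalMs); } catch (InterruptedException e) { return; }
        }
    }

    public Map<String, Integer> stopAndReport() {
        running = false;
        return hits;
    }

    public static void main(String[] args) throws Exception {
        Thread worker = new Thread(() -> {
            long x = 0;
            for (int i = 0; i < 50_000_000; i++) x += i; // hot loop to sample
            System.out.println("work done: " + x);
        });
        MiniProfiler prof = new MiniProfiler(worker, 5);
        worker.start();
        prof.start();
        worker.join();
        System.out.println("samples: " + prof.stopAndReport());
    }
}
```

Sampling trades accuracy for low overhead: the profiled thread is never instrumented or paused, so the counts are statistical, but for finding the hottest method that is usually enough.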

Posted by Srinivasan Balram | No Comments
Stanford Researchers Turn Paper into Batteries Saturday, January 2, 2010

Stanford University researchers have demonstrated a way of using nanotechnology to turn ordinary paper into a battery. The process involves coating an ordinary sheet of paper with a special ink made of carbon nanotubes and silver nanowires. The paper is then baked, after which it becomes a highly conductive storage device.

Thanks to the small diameters of these materials, the ink sticks strongly to the fibrous paper, allowing the battery to be extremely durable. According to the researchers, the paper batteries will be low-cost, may be crumpled or folded, and can even be soaked in acidic or basic solutions, yet their performance does not degrade.

Peidong Yang, professor of chemistry at UC Berkeley, is quoted as saying, “This technology has the potential to be commercialized within a short time. I don’t think it will be limited to just energy storage devices. This is potentially a very nice, low-cost, flexible electrode for any electrical device.”

Posted by Srinivasan Balram | No Comments