Thibaut Loilier is in charge of Market Research for GFT, a member of the innovation team, and the author of GFT's public whitepapers (Open Innovation, Mobile Banking, Mobile Payment and IT trends). Before joining GFT, he worked as a Business Strategy Analyst for BNP Paribas in San Francisco, responsible for strategic analysis and for relationships with Silicon Valley's technology and innovation ecosystems (start-ups, academics, clusters…). He brings his experience in, and knowledge of, the start-up industry and ecosystem to the CODE_n initiative.
Big data is everywhere. Every single minute, servers around the world store 98,000 tweets, 695,000 Facebook status updates, 1 million instant messages, 698,445 Google searches, and 168 million emails – 1,820 terabytes of data. The sheer volume of this data, and the speed at which it is gathered, are almost unimaginable. But imagine we must, if we're to make the most of this vast treasure trove: data is accumulating faster than technology can cope with it.
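To put those per-minute figures in perspective, a quick back-of-the-envelope calculation (using only the numbers quoted above) shows what a single day looks like:

```python
# Data generated per minute, from the figures quoted above
tb_per_minute = 1820              # terabytes of data per minute
emails_per_minute = 168_000_000   # emails per minute

minutes_per_day = 60 * 24         # 1,440 minutes in a day

tb_per_day = tb_per_minute * minutes_per_day
exabytes_per_day = tb_per_day / 1_000_000   # 1 EB = 1,000,000 TB

print(f"{tb_per_day:,} TB per day (~{exabytes_per_day:.2f} exabytes)")
print(f"{emails_per_minute * minutes_per_day:,} emails per day")
```

At that rate, a single day produces roughly 2.6 exabytes of data and well over 200 billion emails.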
Over the coming weeks and months, we'll take an in-depth look at five different aspects of big data, all vital elements in the process of monetising that data: infrastructure, software, security, legal implications, and exploitation. By spotlighting the challenges and the innovative work going on in these areas today, we hope to educate and inspire the new big data generation.
Infrastructure

Big data needs big storage, so careful planning is essential. Visionaries like data scientist Jeff Hammerbacher at Cloudera are setting the tone for productive big data architectures, which in turn need to be backed by big dollars. InfoChimps, recently sold to consulting firm CSC, developed a popular big data query and processing platform, while Continuuity is partnering with Java developers to build and run innovative Hadoop™ and HBase™ applications.
Software

Developers and the cloud hold the keys to effective big data implementation and management. A Cloudera co-founder went on to found WibiData, which targets big data usage for retail and finance on mobile and SaaS platforms. Domo is making big data intelligence easily accessible to businesses, and Data Gravity is tackling storage issues for mid-sized businesses. Waiting in the wings is stealth-mode Sqrrl, with a secure open-source Apache/NoSQL platform.
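Frameworks like Hadoop™ distribute work using the MapReduce pattern. The classic word-count example below is a minimal, framework-free Python sketch of that idea; the real thing shards the map and reduce phases across a cluster rather than running them in one process:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce: sum the emitted counts for each distinct word."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

docs = ["big data is everywhere", "big data needs big storage"]
word_counts = reduce_phase(map_phase(docs))
print(word_counts["big"])   # "big" appears 3 times across the two documents
```

The value of the pattern is that both phases parallelise naturally: documents can be mapped independently, and counts for different words can be reduced independently.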
Security

Big data poses a big security challenge, because it is dynamic and both structured (factual) and unstructured (metadata). There's no silver bullet (yet), but companies like Packetloop (analysing network traffic in real time), Dataguise (discovery and masking of sensitive information), and ZettaSet (creating a secure wrapper around Hadoop distributions) are doing interesting work. Security vendors are also using big data themselves, to plot trends and develop predictive protection.
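As a toy illustration of the kind of work a masking tool performs (the vendors above do far more, including automated discovery across data stores), here is a minimal Python sketch that redacts email addresses and card-like digit runs from free text. The patterns are simplistic assumptions for illustration, not production rules:

```python
import re

# Deliberately simplistic patterns, for illustration only; real
# discovery/masking products use far more sophisticated detection.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def mask_sensitive(text):
    """Replace email addresses and card-like digit runs with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    text = CARD.sub("[CARD]", text)
    return text

record = "Contact jane.doe@example.com, card 4111 1111 1111 1111."
print(mask_sensitive(record))
```

The masked record keeps its shape for analytics while the sensitive values are gone, which is the point of masking over outright deletion.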
Legal Implications and Compliance
A big part of the security challenge is data protection and legislative compliance. There is also a question of jurisdiction: with cloud storage, do organisations even know which data is stored where? Developers like Data Boiler Technologies are betting that companies will outsource these tasks to third-party specialists. As with security, big data is also being used in law enforcement to fight crime, especially in the financial sector, where money laundering is a major concern.
Exploiting Big Data
This is where the rubber meets the road. Whether it's predicting health service needs, analysing retail sell-through patterns and restocking accordingly, detecting malware, preventing fraud, or providing disaster relief, the necessary data is there – it just needs the right key to unlock it. Take a look at some of the hottest developments in big data today at Big Data News, then let us know in the comments below how you aim to exploit big data in your organisation – or any other organisation.
It’s going to be an interesting journey, so why not join us for the ride?