LONDON, July 7, 2016
Big Data Partnership, Europe’s largest independent big data professional services consultancy, today launched its Data Capability Framework (DCF). This first-to-market proposition is a customised evaluation process for optimising big data assets at any stage of deployment. It has been developed to help businesses maximise value, accelerate speed to market and provide delivery assurance. The comprehensive assessment is followed by recommendations across 22 individual capabilities in three streams: Technology, Business and Governance.
This follows research conducted by Big Data Partnership among senior executives at the recent Big Data Analytics event, which found that nearly half (44%) cited assessing essential capabilities internally as their biggest challenge when implementing a big data strategy. Choosing the right tools and technologies was the next most common challenge (41%), followed closely by missing skills and talent (38%).
Big data is clearly one of the hottest topics in technology and cannot be ignored. According to a recent forecast from International Data Corporation (IDC), the global big data market is expected to be worth US$48.6 billion by 2019, and how businesses make the most of the big data at their disposal is becoming an increasingly important C-level objective, as the potential gains are huge.
Where DCF differs from existing market offerings is that it starts by working with C-level stakeholders and placing business strategy at its core, rather than limiting the assessment to the technology stack. Following a three-day assessment of the specific needs of the business, a tailored plan is created that can be implemented by the organisation.
In a competitive marketplace where vendors are beginning to offer similar solutions, Big Data Partnership CEO Carmen Carey points out: “DCF is not product driven; rather, it’s a holistic approach to change management. We’re reviewing companies’ technology stacks as well as whether they have the right skills to support big data technology and the governance to ensure that programmes are successfully delivered.”
Designed to prevent businesses from making unnecessary investments without fully understanding what they are looking to achieve, DCF provides a structured base for the implementation of Big Data Partnership’s full-lifecycle 3D methodology. Used in tandem with DCF, this proven process fast-tracks big data deployments by providing expertise in strategy creation, infrastructure building and programme execution.
“DCF is for companies that want to begin big data programmes but do not know where to start and those that are already leveraging big data but may not be aware of the latest technologies and the benefits,” says Big Data Partnership Head of Architecture Paul Quinn. “The focus of DCF is entirely around ensuring that businesses invest in the correct areas in line with their business goals. It means that businesses can rest easy knowing that they have not spent a lot of money on a big data programme that is unsuitable and that will not deliver business value.”
One of the UK’s largest employers is currently using DCF to determine the foundation for its multi-phased big data programme. Due to the size of the organisation and its extensive existing technical architecture, it was imperative that the client undertook an end-to-end evaluation to provide a structured base to launch an effective big data programme.
“This is as much about organisational clarity as it is risk management,” says Quinn. “When everyone is on the same page, there is a tangible increase in delivery confidence. Beyond that, it’s about moving companies to have a data-driven mindset. We are not just giving companies a quantitative way of understanding their architecture, we’re giving them a mechanism to implement and validate their big data strategies.”
Big Data Partnership works with its DCF customers to run a “health check” every six months to ensure goals are being delivered. The company routinely audits the DCF assessment criteria to ensure they remain closely aligned with the latest big data technologies.