The Open Data Platform Initiative (ODPi), a nonprofit consortium of big data industry leaders focused on the simplification and standardization of the big data ecosystem, announced today the release of its ODPi runtime specification and test suite to ensure applications will work across multiple Apache Hadoop distributions.
Based on Apache Hadoop 2.7, the runtime specification includes the HDFS, YARN, and MapReduce components and is part of the common reference platform, ODPi Core.
The ODPi test framework and self-certification align closely with the Apache Software Foundation by leveraging Apache Bigtop for comprehensive packaging, testing, and configuration. More than half of the code in the latest Bigtop release originated in ODPi.
The runtime specification and test suite were designed to ensure interoperability across the Hadoop ecosystem. ODPi believes the current ecosystem is slowed by a lack of standards and by fragmented, duplicated efforts.
“We aim to speed Hadoop adoption through ecosystem interoperability rooted in open source so enterprise customers can reap the benefits of increased choice with more modern data applications and solutions,” said Alan Gates, co-founder of Hortonworks, an ODPi member organization. “We are pleased to see ODPi’s first release become available to the ecosystem and look forward to our continued involvement to accelerate the adoption of modern data applications.”
“The turbulent big data market needs more confidence, more maturity, and less friction for both technology vendors and consumers alike,” said Nik Rouda, senior big data analyst at Enterprise Strategy Group (ESG). “ESG research found that 85 percent of those responsible for current Hadoop deployments believed that ODPi would add value.”
SOURCE: Data Informed