DataOps

DataOps is a set of practices, processes, and technologies that combines an integrated and process-oriented perspective on data with automation and methods from agile software engineering to improve the quality, speed, and collaboration of data analytics and to promote a culture of continuous improvement in the field. While DataOps began as a set of best practices, it has since matured into an independent approach to data analytics. DataOps applies to the entire data lifecycle, from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations. DataOps incorporates the Agile methodology to shorten the cycle time of analytics development in alignment with business goals.
DevOps, a set of practices that combines software development (''Dev'') and IT operations (''Ops''), focuses on continuous delivery by leveraging on-demand IT resources and by automating the testing and deployment of software. This merging of software ''development'' and IT ''operations'' has improved the velocity, quality, predictability, and scale of software engineering and deployment. Borrowing methods from DevOps, DataOps seeks to bring these same improvements to data analytics. DataOps utilizes statistical process control (SPC) to monitor and control the data analytics pipeline. With SPC in place, the data flowing through an operational system is constantly monitored and verified to be working; if an anomaly occurs, the data analytics team can be notified through an automated alert. DataOps is not tied to a particular technology, architecture, tool, language, or framework. Tools that support DataOps promote collaboration, orchestration, quality, security, access, and ease of use.
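
As a minimal sketch of how SPC-style checks might be wired into a pipeline, the following Python example derives control limits from historical batch metrics and raises an automated alert when a new batch falls outside them. The monitored metric (daily row count), the three-sigma limit width, and the notify_team helper are illustrative assumptions, not part of any particular DataOps product.

    # SPC-style monitoring sketch: learn control limits from past runs,
    # then flag any batch whose metric falls outside those limits.
    import statistics

    def control_limits(history, sigmas=3.0):
        """Derive lower/upper control limits from past metric values."""
        mean = statistics.mean(history)
        sd = statistics.stdev(history)
        return mean - sigmas * sd, mean + sigmas * sd

    def notify_team(message):
        # Placeholder alert channel: in practice this might page an
        # on-call engineer or post to a chat room.
        print("ALERT:", message)

    def check_batch(row_count, history):
        """Verify a new batch against limits learned from prior runs."""
        lower, upper = control_limits(history)
        if not (lower <= row_count <= upper):
            notify_team(f"Row count {row_count} outside control limits "
                        f"[{lower:.0f}, {upper:.0f}]")
        history.append(row_count)

    # Example: daily row counts observed in previous pipeline runs.
    history = [10_120, 9_980, 10_050, 10_210, 9_940]
    check_batch(4_500, history)   # anomalous batch triggers an alert
    check_batch(10_075, history)  # in-control batch passes silently

The same pattern applies to any pipeline metric (null rates, schema drift counts, load latency); the key idea is that limits come from the process's own history rather than fixed thresholds.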


History

DataOps was first introduced by Lenny Liebmann, Contributing Editor at ''InformationWeek'', in a blog post on the IBM Big Data & Analytics Hub titled "3 reasons why DataOps is essential for big data success," published on June 19, 2014. The term, a moniker for "data operations," was later popularized by Andy Palmer of Tamr and by Steph Locke. 2017 was a significant year for DataOps, with notable ecosystem development, analyst coverage, increased keyword searches, surveys, publications, and open source projects. Gartner named DataOps on the Hype Cycle for Data Management in 2018.


Goals and philosophy

The volume of data is forecast to grow at a compound annual growth rate of 32%, reaching 180 zettabytes by the year 2025 (source: IDC). DataOps seeks to provide the tools, processes, and organizational structures to cope with this significant increase in data. Automation streamlines the daily demands of managing large integrated databases, freeing the data team to develop new analytics more efficiently and effectively. DataOps seeks to increase the velocity, reliability, and quality of data analytics. It emphasizes communication, collaboration, integration, automation, measurement, and cooperation between data scientists, analysts, data/ETL (extract, transform, load) engineers, information technology (IT), and quality assurance/governance.


Implementation

Toph Whitmore at Blue Hill Research offers these DataOps leadership principles for the information technology department:

* ''“Establish progress and performance measurements at every stage of the data flow. Where possible, benchmark data-flow cycle times.''
* ''Define rules for an abstracted semantic layer. Ensure everyone is “speaking the same language” and agrees upon what the data (and metadata) is and is not.''
* ''Validate with the “eyeball test”: Include continuous-improvement-oriented human feedback loops. Consumers must be able to trust the data, and that can only come with incremental validation.''
* ''Automate as many stages of the data flow as possible, including BI, data science, and analytics'' (see the sketch after this list).
* ''Using benchmarked performance information, identify bottlenecks and then optimize for them. This may require investment in commodity hardware, or automation of a formerly human-delivered data-science step in the process.''
* ''Establish governance discipline, with a particular focus on two-way data control, data ownership, transparency, and comprehensive data lineage tracking through the entire workflow.''
* ''Design process for growth and extensibility. The data flow model must be designed to accommodate volume and variety of data. Ensure enabling technologies are priced affordably to scale with that enterprise data growth.”''
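
As a hedged illustration of the benchmarking and automation principles above, the following Python sketch times each stage of a toy extract-transform-load flow so that the slowest stage can be identified as the optimization target. The stage functions and the timed helper are hypothetical stand-ins, not Whitmore's prescription or any specific tool's API.

    # Benchmarking sketch: record per-stage cycle times for a simple
    # data flow, then report them to surface the bottleneck.
    import time

    metrics = []

    def timed(stage_name, func, *args, **kwargs):
        """Run one pipeline stage and record how long it took."""
        start = time.perf_counter()
        result = func(*args, **kwargs)
        metrics.append((stage_name, time.perf_counter() - start))
        return result

    def extract():
        return list(range(1_000_000))   # stand-in for a source query

    def transform(rows):
        return [r * 2 for r in rows]    # stand-in for cleaning logic

    def load(rows):
        return len(rows)                # stand-in for a warehouse write

    rows = timed("extract", extract)
    rows = timed("transform", transform, rows)
    count = timed("load", load, rows)

    # Slowest stage first: that is where optimization effort goes.
    for stage, seconds in sorted(metrics, key=lambda m: m[1], reverse=True):
        print(f"{stage:10s} {seconds:.3f}s")

In a real deployment these timings would be emitted to a metrics store on every run, so that cycle-time trends, not single measurements, drive the optimization decisions.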


Events

* Data Opticon
* Data Ops Summit
* Data Ops Online Champion

