

Welcome back to the 2017 Definitive Data Operations Report MiniSeries! This week, we’re covering key data activities in DataOps as data continues to grow, and how that growth affects the need to share data as well as the tools used to share it.
Wait, what are they doing? Data Activities.
Every day, data professionals and executives spend more of their time on analysis than on any other single activity. On average, respondents spend 22% of their time on analysis, and some report spending 100% of their time analyzing data, which is no surprise. However, when looking at the overall picture of key DataOps activities, integration, troubleshooting, data pipelines, and ETL jobs combined take up almost half of respondents’ time.
These tasks are central to keeping data operations running efficiently in this time of data growth. The more time data pros and executives must spend on DataOps tasks, the less time they have for the analysis that actually derives value from the data.
Fast & Furious: Data Growth
In fact, 63% of respondents reported that their data is growing by at least 100 gigabytes per day, and 13% reported growth of at least 1 terabyte a day. One terabyte can hold 33 copies of the entire Star Wars saga. Calculated over a month, that’s almost eight thousand movies!
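If you want to check that back-of-the-envelope figure yourself, here is a quick sketch in Python. The 33-copies-per-terabyte number comes from the report; the film count (8) and the month length (30 days) are assumptions chosen to match its "almost eight thousand" total.

```python
# Sanity check of the Star Wars arithmetic above (a sketch, not from the report).
COPIES_PER_TB = 33    # saga copies that fit in one terabyte, per the article
FILMS_PER_SAGA = 8    # assumed: number of Star Wars films circa 2017
DAYS_PER_MONTH = 30   # assumed: month length

movies_per_day = COPIES_PER_TB * FILMS_PER_SAGA      # 264 movies/day at 1 TB/day
movies_per_month = movies_per_day * DAYS_PER_MONTH   # 7,920 -- "almost eight thousand"
print(f"{movies_per_month:,} movies per month")
```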
Sharing is caring
With data growing this quickly, it is important to find efficient ways to ingest and share it. Surprisingly, the most popular tools currently used to share data are FTP (37%) and Dropbox (16%). Given how long these tools have been around, it will be interesting to see how much longer they remain the most popular.
Download the full report here.
Read last week’s recap here.
Read the recap of conclusions here.