How People Actually Use AI for Data
We analyzed a sample of conversations with the Zerve coding agent (3,394 real interactions) to understand how people use AI in their daily work. The results show that users are not treating it as a novelty. They are integrating it into how they think, code, and communicate.
AI as a Working Partner
Most sessions are not single purpose. A typical conversation combines writing, data exploration, debugging, and code review. About 86 percent of sessions include multiple activities, with an average of nearly three distinct goals. The most common pattern involves a user writing, analyzing, and visualizing data in one continuous workflow. Instead of switching between tools, they stay in one environment and iterate with the agent as a collaborator.
Content Creation That Is Technical and Structured
Nearly 70 percent of all conversations involve content generation. This is not limited to rewriting text or fixing grammar. Users are producing documentation, analytical reports, technical walkthroughs, and structured outputs that mix narrative and computation. The agent often acts as a co-author for reports or presentation material created from live data.
Data Analysis at the Center
Even when analysis is not the main goal, it still appears in most sessions. About 82 percent include some form of data exploration. Users build models, interpret datasets, and validate assumptions through conversational coding. Roughly three quarters produce visualizations or summaries that are refined enough to share directly with colleagues or clients. AI is becoming the link that ties data work together across steps that once required several tools.
Where It Happens Most
The largest segments come from technology and education, which together account for about 60 percent of all usage. Developers use the agent for prototyping, debugging, and automation. Educators and instructional designers use it to create lessons, exercises, and assessment visuals from raw data.
The Larger Shift
Users are moving away from simple question and answer interactions toward full workflow collaboration. They treat AI as a reasoning and coding partner that can manage context across multiple steps of a project. For organizations, this means AI is no longer just for individual tasks. It is becoming a central part of how work flows from idea to implementation. Companies that adopt this mindset will move faster and think more broadly than those that see AI as a search tool or a shortcut.
Try Zerve to see how an AI coding partner can streamline your data work from the first prompt to the final result.
Frequently Asked Questions
What are data scientists primarily doing in Zerve?
Data scientists in Zerve focus on automating and improving pipelines, streamlining exploratory data analysis, running machine learning workflows, preparing data for modeling, turning work into insights, and reusing and sharing workflows.
How does Zerve help in automating and improving data pipelines?
Zerve helps users repair and accelerate their data pipelines with tools that automate repetitive tasks and optimize workflow efficiency, making data processing smoother and faster.
In what ways does Zerve streamline Exploratory Data Analysis (EDA)?
Zerve allows users to easily load data and check distributions, making EDA accessible throughout various stages of the project to better understand data characteristics quickly.
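As a rough illustration of the "load data and check distributions" step described above, here is a minimal EDA sketch in plain pandas. This is a generic example with made-up sample data, not Zerve's own API; the `quick_eda` helper and the columns are hypothetical.

```python
import pandas as pd

def quick_eda(df: pd.DataFrame) -> pd.DataFrame:
    """Return per-column summary statistics, a common first EDA step."""
    return df.describe()

# Hypothetical sample data standing in for a user's dataset.
df = pd.DataFrame({
    "age": [23, 35, 31, 52, 46, 29],
    "income": [38000, 52000, 47000, 88000, 76000, 41000],
})

summary = quick_eda(df)
print(summary.loc["mean", "age"])  # average age across the sample
```

In practice this is the kind of check users run repeatedly as a project evolves, which is why having it one prompt away matters.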
Can Zerve simplify running machine learning workflows?
Yes, Zerve simplifies training and validating machine learning models without the need for complex setup or coding, enabling users to focus on model performance and experimentation.
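The train-and-validate loop mentioned above can be sketched in a few lines of scikit-learn. Again, this is a generic stand-in (synthetic data, a simple logistic regression) rather than anything Zerve-specific.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a user's real dataset.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Hold out a validation split, train, and score.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
```

The point of removing setup overhead is that users can iterate on exactly this loop, swapping models and features, without leaving the conversation.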
How does Zerve assist with preparing data for modeling?
Zerve addresses the time-consuming task of data cleaning by providing efficient tools and workflows that help clean and prepare datasets effectively for modeling purposes.
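A typical cleaning pass of the kind described, deduplication plus imputation of missing numeric values, looks something like the following. The `clean` helper and the toy data are hypothetical, shown in plain pandas for illustration only.

```python
import pandas as pd

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate rows, then fill missing numeric values with the column mean."""
    out = df.drop_duplicates()
    return out.fillna(out.mean(numeric_only=True))

# Toy dataset: one duplicate row and one missing value.
raw = pd.DataFrame({"x": [1.0, None, 3.0, 3.0], "y": [10, 20, 30, 30]})
tidy = clean(raw)
```

Automating passes like this is where most of the time savings in data preparation come from.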
Does Zerve support reusing and sharing workflows among teams?
Absolutely. Users often build reusable workflows within Zerve, which can be shared across teams to promote collaboration, consistency, and efficiency in data science projects.

