How I'm using LLMs to transform my data analytics workflows

I’ve started using Large Language Models (LLMs) in my day-to-day data analytics, and the results have been so significant that I wanted to share my experience with you.


Making it easier to get started

As a former university instructor, I've observed students spending dozens of hours learning Python simply to complete basic coursework. The traditional learning curve is steep—often requiring weeks or months of dedicated study before becoming productive.

But here's my key insight: working professionals shouldn't have to follow this same path.

With LLMs as your coding partner, you can be productive with just basic Python knowledge. This approach dramatically lowers the barrier to entry for data analytics:

  • You focus on the business problem, not syntax details
  • The learning happens organically through practical application
  • You can start automating data tasks within days, not months
  • The code comes with explanations that enhance your understanding

This is particularly powerful for engineering and operations teams, and others who need data insights but aren't full-time developers. LLMs effectively democratise access to sophisticated data analysis techniques.
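To make this concrete, here is a hedged sketch of the kind of code an LLM might return for a plain-English request like "summarise total and average sales by region". The column names and numbers are invented for the example:

```python
import pandas as pd

# Toy data standing in for a real export -- the columns and values are
# invented for this example; substitute your own data.
df = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "sales": [120, 135, 98, 110],
})

# Total and average sales per region -- the sort of one-liner an LLM
# can generate and explain on request.
summary = df.groupby("region")["sales"].agg(total="sum", mean="mean")
print(summary)
```

You never had to remember the `groupby`/`agg` syntax yourself; you describe the outcome, then read the explanation that comes back with the code.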

The benefits I've discovered:

By leveraging LLMs to generate Python scripts for my Jupyter notebooks, I've completely transformed how I approach data analysis. Here's why this combination has been so powerful:

🚀 Dramatic Time Savings: Tasks that used to take hours can now be completed in minutes. I describe what I need, and the LLM generates the code framework instantly.

🧩 Perfect for Prototyping: I can rapidly test different analytical approaches by simply describing what I want to try next.

💻 Code Customisation: I get working code that I can then modify and fine-tune for my specific needs - no more starting from scratch!

🔄 Iterative Improvement: The feedback loop is incredibly tight - I can request changes or enhancements and get updated code immediately.

🔒 Data Security: Only my description of the data structure goes to the LLM - the data itself stays entirely within my in-house Jupyter notebook environment, never shared with external services or platforms.

The workflow:


[Image: An LLM prompt describing the data structure and analysis goals]

  1. I start by clearly describing my data structure and analysis goals
  2. The LLM generates Python code optimised for Jupyter
  3. I run the code and evaluate results
  4. I request refinements as needed
  5. I document insights and share findings with stakeholders

In a recent example of this workflow, I described the analysis I wanted to an LLM, specifying the visualisations and statistical tests needed. Within minutes, I had a working Jupyter notebook with pandas handling the data processing and seaborn/matplotlib providing interactive visualisation.
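The five steps above can be sketched end to end in a single notebook cell. This is a hedged illustration - the data, column names, and outlier threshold are invented stand-ins for a real CSV load - showing the kind of code the LLM produced at step 2 and the refinements I requested at step 4:

```python
import pandas as pd

# Step 1 described the data; step 2 produced code along these lines.
# Invented example data standing in for a real load such as
# pd.read_csv("measurements.csv").
raw = pd.DataFrame({
    "sensor": ["A", "A", "B", "B", "B"],
    "reading": [10.2, None, 9.8, 10.5, 250.0],  # None = missing, 250.0 = outlier
})

# Steps 3-4: after running it once, I asked for refinements -- drop
# missing values and filter obvious outliers before summarising.
clean = raw.dropna(subset=["reading"])
clean = clean[clean["reading"].between(0, 100)]

# Step 5: a per-sensor summary to document and share with stakeholders.
report = clean.groupby("sensor")["reading"].describe()
print(report)
```

Each refinement in the middle block came from a one-sentence follow-up prompt rather than from me hunting through pandas documentation.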


[Image: Jupyter notebook interactive plot with widgets]
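The widget-driven plot shown above follows a standard ipywidgets pattern. A minimal sketch, assuming invented example data and parameter names - inside Jupyter, `interact` renders a slider that re-draws the plot whenever you move it:

```python
import ipywidgets as widgets
from ipywidgets import interact
import matplotlib.pyplot as plt
import numpy as np

# Invented example series for the sketch: a noisy sine wave.
x = np.linspace(0, 10, 200)

def plot_smoothed(window=5):
    """Re-draw the series with a moving-average window chosen via the slider."""
    y = np.sin(x) + np.random.default_rng(0).normal(0, 0.2, x.size)
    smoothed = np.convolve(y, np.ones(window) / window, mode="same")
    plt.plot(x, y, alpha=0.3, label="raw")
    plt.plot(x, smoothed, label=f"window={window}")
    plt.legend()
    plt.show()

# In a notebook, this renders an IntSlider above the plot.
slider = widgets.IntSlider(min=1, max=25, value=5, description="window")
interact(plot_smoothed, window=slider)
```

Again, I did not write this from memory - I asked for "an interactive smoothing slider for this plot" and adjusted the result.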


For anyone who needs to use data analytics in their work, I highly recommend exploring how LLMs can enhance your workflow. You stay in control of your analysis while speeding up your process—think hours saved on reports or routine tasks. If you’ve hesitated because you’re not a coder, I especially urge you to give it a try—no coding skills needed!
