How I'm using LLMs to transform my data analytics workflows
I’ve started using Large Language Models (LLMs) in my day-to-day data analytics, and the results have been so significant that I wanted to share my experience with you.
Making it easier to get started
As a former university instructor, I've observed students spending dozens of hours learning Python simply to complete basic coursework. The traditional learning curve is steep—often requiring weeks or months of dedicated study before becoming productive.
But here's my key insight: working professionals shouldn't have to follow this same path.
With LLMs as your coding partner, you can be productive with just basic Python knowledge. This approach dramatically lowers the barrier to entry for data analytics.
This is particularly powerful for engineering and operations teams, and others who need data insights but aren't full-time developers. LLMs effectively democratise access to sophisticated data analysis techniques.
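To make this concrete, here is a sketch of the kind of code an LLM might hand back when asked in plain English for "the average order value per region". The dataset, column names, and values are hypothetical, invented purely for illustration, but the result is short and readable enough that basic Python is all you need to follow and tweak it:

```python
import pandas as pd

# Hypothetical data: in practice this would come from your own CSV or database.
orders = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "order_value": [250.0, 310.0, 180.0, 220.0],
})

# One readable line does the aggregation: group by region, average the values.
avg_by_region = orders.groupby("region")["order_value"].mean()
print(avg_by_region)
```

Swapping the column names or the aggregation (`mean` to `sum`, say) is exactly the kind of small edit a non-developer can make confidently once the LLM has provided the framework.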
The benefits I've discovered:
By leveraging LLMs to generate Python scripts for my Jupyter notebooks, I've completely transformed how I approach data analysis. Here's why this combination has been so powerful:
🚀 Dramatic Time Savings: Tasks that used to take hours can now be completed in minutes. I describe what I need, and the LLM generates the code framework instantly.
🧩 Perfect for Prototyping: I can rapidly test different analytical approaches by simply describing what I want to try next.
💻 Code Customisation: I get working code that I can then modify and fine-tune for my specific needs - no more starting from scratch!
🔄 Iterative Improvement: The feedback loop is incredibly tight - I can request changes or enhancements and get updated code immediately.
🔒 Data Security: Your data stays entirely within your in-house Jupyter notebook environment - you describe the task to the LLM without uploading the sensitive data itself to external services or platforms.
The workflow:
In practice, I describe the analysis I want to an LLM, specifying the visualisations and statistical tests needed. Within minutes, I have a working Jupyter notebook that processes the data with pandas and plots it with seaborn/matplotlib.
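The result typically looks something like the sketch below. Everything here is illustrative: the dataset, column names, and the choice of a two-sample t-test are assumptions standing in for whatever analysis you actually describe to the LLM, not the specific notebooks from my own work.

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend so the script also runs outside Jupyter
import matplotlib.pyplot as plt
import seaborn as sns
from scipy import stats

# Hypothetical data: response times (ms) for two server configurations.
df = pd.DataFrame({
    "config": ["A"] * 5 + ["B"] * 5,
    "response_ms": [120, 135, 118, 142, 130, 98, 105, 110, 95, 102],
})

# pandas data processing: summary statistics per group.
summary = df.groupby("config")["response_ms"].agg(["mean", "std"])
print(summary)

# Statistical test: is the difference between the two configs significant?
a = df.loc[df["config"] == "A", "response_ms"]
b = df.loc[df["config"] == "B", "response_ms"]
t_stat, p_value = stats.ttest_ind(a, b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# seaborn visualisation, saved to a file for the report.
sns.boxplot(data=df, x="config", y="response_ms")
plt.savefig("response_times.png")
```

From here, the tight feedback loop kicks in: ask the LLM to swap the boxplot for a violin plot, add a second grouping column, or change the test, and you get updated code immediately.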
For anyone who needs to use data analytics in their work, I highly recommend exploring how LLMs can enhance your workflow. You stay in control of your analysis while speeding up the process, saving hours on reports and routine tasks. If you've hesitated because you're not a coder, I especially urge you to give it a try - basic Python is all you need!