What can you build with a CSV file, some Python, and open data? Quite a lot, actually. With free public transport for under-18s across Melbourne, the question isn’t just where trains go — it’s what’s actually reachable from each station. By combining open datasets from Public Transport Victoria and OpenStreetMap, we used Python isochrones and a React interface to show what’s walkable in 15 minutes via real streets — not straight lines. This is the power of open data: turning public infrastructure into real opportunity. 👉 Read more here: https://lnkd.in/gDEeyN8E #OpenData #DataVisualisation #Python #UrbanTech #TransportData #Accessibility
Unlocking Melbourne's Transport Network with Open Data
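The post's isochrone idea can be sketched without any GIS stack: an isochrone is just "every node reachable within a time budget" on a street graph, which is Dijkstra's algorithm with a cutoff. This is a minimal stdlib sketch; the graph, node names, and walk times are made up for illustration (a real version would load the street network from OpenStreetMap).

```python
import heapq

# Toy street graph: node -> [(neighbour, walk_time_minutes)]
# Hypothetical intersections; real isochrones use OSM street data.
graph = {
    "station": [("a", 4), ("b", 6)],
    "a": [("station", 4), ("c", 5)],
    "b": [("station", 6), ("c", 3), ("d", 12)],
    "c": [("a", 5), ("b", 3)],
    "d": [("b", 12)],
}

def isochrone(graph, source, budget):
    """Return {node: minutes} for nodes reachable within `budget` (Dijkstra)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        t, node = heapq.heappop(heap)
        if t > dist[node]:
            continue  # stale heap entry
        for nbr, w in graph[node]:
            nt = t + w
            if nt <= budget and nt < dist.get(nbr, float("inf")):
                dist[nbr] = nt
                heapq.heappush(heap, (nt, nbr))
    return dist

reach = isochrone(graph, "station", 15)
```

Because travel follows graph edges rather than straight lines, node `d` (18 minutes away via `b`) falls outside the 15-minute isochrone even though a crow-flies buffer might include it.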
Level up your data stack! From Polars for speed to Great Expectations for quality, here are 8 essential Python libraries every Data Engineer needs to build faster, more resilient pipelines. What’s missing from your toolkit? Drop a comment below with the libraries you think every data engineer should be using! 👇 #DataEngineering #Python #BigData #ETL #ELT #DataStack #Pyspark #SoftwareEngineering
Scale vector search to millions without rewriting your prototype code ⚡ Building semantic search typically starts with storing vectors in Python lists and computing cosine similarity manually. But brute-force comparison scales linearly with your dataset, making every query slower as your data grows. Qdrant is a vector search engine built in Rust that indexes your vectors for fast retrieval. Key features: • In-memory mode for local prototyping with no server setup • Seamlessly scale to millions of vectors in production with the same Python API • Built-in support for cosine, dot product, and Euclidean distance • Sub-second query times even for millions of vectors ☕️ Run this code: https://bit.ly/4cCI76w #VectorDatabase #Python #SemanticSearch #DataScience
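The "prototype" approach the post describes, vectors in Python lists with manual cosine similarity, looks roughly like this minimal sketch (the embeddings and document names are made up). Every query scans all N vectors, which is the linear cost an indexed engine like Qdrant avoids.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical document embeddings
corpus = {
    "doc1": [1.0, 0.0, 0.0],
    "doc2": [0.7, 0.7, 0.0],
    "doc3": [0.0, 0.0, 1.0],
}

def search(query, corpus, k=2):
    # Brute force: score every vector, then sort -- O(N) per query
    scored = sorted(corpus.items(), key=lambda kv: cosine(query, kv[1]),
                    reverse=True)
    return [doc for doc, _ in scored[:k]]

top = search([1.0, 0.1, 0.0], corpus)
```

An approximate-nearest-neighbour index replaces that full scan with a sub-linear lookup, which is why the same workload keeps sub-second latency at millions of vectors.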
📊 Screen Time Analysis Project
1. Explored mobile usage patterns by analyzing screen time data using Python and Pandas.
2. Identified the most used apps, calculated daily screen time, and visualized insights using Matplotlib.
3. A simple project showing how data can reveal real user behavior and trends.
Tools: Python | Pandas | Matplotlib | Jupyter Notebook
#DataAnalytics #Python #Pandas #DataVisualization #DataAnalysis
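The two aggregations the project describes, daily screen time and most-used app, reduce to simple group-bys. A minimal stdlib sketch with made-up log rows (a pandas version would be `df.groupby("date")["minutes"].sum()` and the like):

```python
from collections import defaultdict

# Hypothetical screen-time log: (date, app, minutes)
rows = [
    ("2024-06-01", "Instagram", 45),
    ("2024-06-01", "YouTube", 80),
    ("2024-06-02", "Instagram", 30),
    ("2024-06-02", "Chrome", 25),
]

daily = defaultdict(int)    # total minutes per day
per_app = defaultdict(int)  # total minutes per app
for date, app, minutes in rows:
    daily[date] += minutes
    per_app[app] += minutes

most_used = max(per_app, key=per_app.get)
```

From here, `daily` and `per_app` are exactly the shapes Matplotlib bar charts expect: categories on one axis, summed minutes on the other.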
Day 1 – Rebuilding Python Fundamentals from Scratch 🚀
Today I focused on deeply understanding variables and core numeric behavior in Python. Here’s what I covered:
• Variables are references to objects in memory, not boxes storing values
• Core data types: int, float, str, bool
• Type casting, and why int(x) doesn’t modify the original variable unless reassigned
• The difference between / (true division) and // (floor division)
• Why floor division moves LEFT on the number line for negative numbers
• The mathematical identity behind modulus: a = (a // b) * b + (a % b)
• Why -17 % 4 = 3 (not -1)
• Why id(x) == id(y) can return True due to small-integer interning
Big insight: understanding memory behavior and arithmetic rules removes confusion and prevents hidden bugs. Focusing on strong foundations before moving ahead. On to Day 2 💪
#Python #DataScience #MachineLearning #SoftwareEngineering #LearningInPublic
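Every claim in the list above is directly checkable in the interpreter, which makes these fundamentals easy to verify rather than memorize:

```python
# Floor division rounds toward negative infinity, not toward zero
assert 17 // 4 == 4
assert -17 // 4 == -5        # moves LEFT on the number line

# Modulus identity: a == (a // b) * b + (a % b)
a, b = -17, 4
assert a == (a // b) * b + (a % b)
assert -17 % 4 == 3          # in Python, % takes the sign of the divisor

# int(x) returns a NEW object; x is untouched unless you reassign it
x = 3.9
int(x)
assert x == 3.9

# Small-integer interning: CPython caches ints in roughly [-5, 256],
# so equal small ints can be the very same object (implementation detail)
p, q = 100, 100
assert p is q
```

The `-17 % 4 == 3` result differs from C or Java, where `%` follows the dividend's sign; Python's choice keeps the modulus identity and a non-negative remainder for a positive divisor.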
Raw data is just noise until you give it a shape. 🌦️📉 Fetched the live API with `requests`. Cleaned the chaos with `pandas`. Painted the story with `matplotlib`. We aren't just looking at the weather app anymore; we’re extracting the data and building our own. When you know Python, the whole internet is just a database waiting to be visualized. 🐍✨ What’s your go-to visualization library: Matplotlib, Seaborn, or something else? 👇 #Python #DataScience #Matplotlib #100DaysOfCode
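The fetch-clean-plot pipeline described above can be sketched with a mocked payload, so it runs without a network call; the JSON shape and field names here are made up (a live version would come from `requests.get(url).json()`), and the Matplotlib step is noted rather than executed.

```python
import json

# Hypothetical API response, standing in for requests.get(url).json()
payload = ('{"daily": ['
           '{"day": "Mon", "temp_c": 18.5}, '
           '{"day": "Tue", "temp_c": 21.0}, '
           '{"day": "Wed", "temp_c": 16.2}]}')

data = json.loads(payload)

# "Clean the chaos": pull out just the fields we plot
days = [d["day"] for d in data["daily"]]
temps = [d["temp_c"] for d in data["daily"]]
avg = round(sum(temps) / len(temps), 1)

# "Paint the story" would then be e.g. plt.bar(days, temps)
```

Keeping the fetch, clean, and plot stages as separate steps also makes each one testable on its own, which a single weather-app screenshot never is.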
Hi guys, what the Lord has done for me, I cannot tell it all. I remember when I started learning how to code with nothing but an on-screen keyboard, a book, and a pen. To make it even more interesting, my first programming language was R. If you're interested in getting into data analysis through programming, this notebook provides a step-by-step walkthrough of the data wrangling process in Python, covering how raw data is inspected, cleaned, and prepared for further analysis. https://lnkd.in/dypNihrG #DataAnalytics #PythonForDataScience #Python #DataWrangling #DataPortfolio #Kaggle
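The inspect-clean-prepare cycle the notebook walks through typically handles a few recurring problems: stray whitespace, inconsistent casing, numbers stored as strings, and missing values. A minimal sketch with made-up records (not the notebook's actual data):

```python
# Hypothetical raw records with the usual problems
raw = [
    {"name": "  Alice ", "age": "34", "city": "lagos"},
    {"name": "Bob",      "age": None, "city": " Abuja"},
    {"name": "  Carol",  "age": "29", "city": "LAGOS "},
]

def clean(record):
    """Return a tidy record, or None if a required field is missing."""
    if record["age"] is None:               # drop rows with missing values
        return None
    return {
        "name": record["name"].strip(),
        "age": int(record["age"]),           # cast numeric strings
        "city": record["city"].strip().title(),  # normalise casing
    }

cleaned = []
for r in raw:
    c = clean(r)
    if c is not None:
        cleaned.append(c)
```

The same steps map one-to-one onto pandas idioms (`str.strip()`, `astype(int)`, `dropna()`) once the data outgrows plain dicts.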
The new https://lnkd.in/edQRQ_rZ site has over 15,000 events within 30 miles of more than 7,000 cities and towns in the USA; on the next pass we will do something about our lack of listings for rural America. The visualization came from https://go.guideants.ai/, which is great for doing all kinds of analysis and visualization of the data in a SQLite db in a Python sandbox.
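Analyzing an events dataset in SQLite from Python needs nothing beyond the stdlib `sqlite3` module. A minimal sketch with a made-up schema and rows (not the site's actual data), showing the kind of per-city aggregate that would surface the rural-coverage gap:

```python
import sqlite3

# In-memory db with a hypothetical events table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (name TEXT, city TEXT, state TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("Farmers Market", "Austin", "TX"),
        ("Jazz Night", "Austin", "TX"),
        ("County Fair", "Lubbock", "TX"),
        ("Book Club", "Portland", "OR"),
    ],
)

# Events per city: cities with few rows are the coverage gaps
rows = conn.execute(
    "SELECT city, COUNT(*) AS n FROM events "
    "GROUP BY city ORDER BY n DESC, city"
).fetchall()
conn.close()
```

Pointing a plotting library at a query result like this is all it takes to turn the db into a coverage map.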
✅ Day 9 – For Loops in Python
Today I learned about for loops in Python. A for loop allows us to repeat a task multiple times automatically.
✅ Example:
numbers = [10, 20, 30]
for num in numbers:
    print(num)
This loop prints each value from the list one by one.
✅ Why This Matters in Data Analytics
In real-world data analysis, we often need to:
-- Process large datasets
-- Perform repeated calculations
-- Apply the same operation to many values
Loops help automate these repetitive tasks efficiently.
✅ Today's takeaway: automation is a key skill in data analytics, and loops make it possible.
#Python #DataAnalytics #LearningJourney #BusinessAnalytics #Consistency
Ever wonder how Python, a typically "slow" language, manages to power the heavy lifting of modern Data Science? The answer is almost always NumPy! 😅 This graphic perfectly captures the "how" behind NumPy’s speed boost: Efficient C Arrays: It swaps out standard Python objects for streamlined C arrays "under the hood". Vectorized Operations: NumPy supports operations that apply to entire datasets at once, rather than iterating one-by-one. Reduced Looping: This massive performance bottleneck is bypassed with lightning-fast array calculations. Optimized Memory: It manages memory usage and boosts performance, crucial for working with large datasets. What’s your favorite vectorized NumPy function that absolutely transformed your code’s performance? Let's geek out in the comments! 👇 #DataScience #Python #NumPy #PerformanceComputing #CodingTips
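The loop-versus-vectorized contrast the graphic describes is easy to demonstrate: the same doubling done element-by-element in Python versus as one NumPy call executed in C over the whole array. A minimal sketch (the array size is arbitrary):

```python
import numpy as np

values = np.arange(1_000_000, dtype=np.float64)

# Loop version: one Python-level operation per element
# (only a slice here -- looping the full array is the bottleneck)
loop_total = 0.0
for v in values[:1000]:
    loop_total += v * 2

# Vectorized version: one call, applied to the entire array at once
vec = values * 2
vec_total = vec[:1000].sum()
```

Both paths produce identical results; the difference is that the vectorized one dispatches a single operation to NumPy's C arrays instead of a million interpreted iterations, which is where the order-of-magnitude speedups come from.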
🚀 Day 2 – Data Science Learning Journey Today’s session was all about Matplotlib, one of the most important libraries for data visualization in Python. I explored various functions used to create different types of graphs and plots. It was really interesting to see how raw data can be transformed into meaningful visual insights, making patterns and trends much easier to understand. Every step in this journey is helping me understand how data tells a story through visualization. 📊 #DataScience #Python #Matplotlib #DataVisualization #LearningJourney