Day 2 of my Python Full Stack journey. ✅
Today I covered the very first building blocks of Python:
→ Variables
→ Data Types (int, float, str, bool)
→ f-strings to print dynamic output
Looks simple. But this is the foundation everything else is built on.
Here's what I actually typed today:

name = "Punith"  # str
age = 24         # int
score = 9.5      # float
is_dev = True    # bool
print(f"Hi, I'm {name}, age {age}")

...and a few more experiments. One thing that clicked today: Python figures out the data type automatically. No need to declare it like in Java or C. This is called dynamic typing, and it makes Python so much cleaner to write.
45 minutes. Committed to GitHub. Showing up again tomorrow.
If you're learning to code right now: what was the first concept that actually made sense to you?
#PythonFullStack #Day2 #BuildingInPublic #100DaysOfCode #Bangalore
Python Full Stack Day 2: Variables and Data Types
More Relevant Posts
🧠 How Python Works Internally (Big Picture Explained)
Python is one of the most popular programming languages today, used in web development, data science, automation, and artificial intelligence. But most developers only scratch the surface. If you truly want to master Python, you need to understand what happens behind the scenes when your code runs.
The Big Picture
When you run a Python program, it goes through this pipeline:
Source Code → Bytecode → Python Virtual Machine → Output
Unlike languages such as C or C++, Python does not compile your code ahead of time into machine code. When you run:

x = 5
print(x)

internally:
1. The code is tokenized
2. The tokens are parsed into an AST
3. The AST is compiled to bytecode
4. The bytecode is executed by the PVM
5. Memory is managed automatically
Have you ever explored Python internals before? Which concept surprised you the most? Let's discuss in the comments 👇
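The pipeline above can be inspected from Python itself. A quick sketch using the standard-library dis module to look at the bytecode CPython compiles for the snippet in the post:

```python
import dis

def demo():
    x = 5
    print(x)

# Disassemble the function to see the bytecode the PVM executes.
dis.dis(demo)  # prints ops such as LOAD_CONST and STORE_FAST
```

The exact opcodes vary slightly between CPython versions, but the idea is the same: your source is compiled once to these instructions, and the PVM loops over them.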
🚀 Day 20: Integrating Python with SQLite 🐍🗄️
Today I explored how to integrate Python with SQLite, and honestly, it felt like unlocking real-world application power!
🔹 What I learned:
→ SQLite comes built in with Python (no extra installation needed)
→ How to connect to a database using sqlite3
→ Creating tables using SQL inside Python
→ Inserting and retrieving data using queries
SQLite is lightweight and perfect for beginners to understand how databases work in real applications. It helps store and manage data efficiently in Python programs.
💡 Key takeaway: Python + SQLite = a simple yet powerful combo for building real-world projects like student management systems, blogs, and more
#Python #SQLite #10000Coders #LearningJourney #Coding #Database #BackendDevelopment
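A minimal sketch of the workflow described above, using only the standard-library sqlite3 module (the table and column names are just illustrative):

```python
import sqlite3

# In-memory database for illustration; pass a filename instead to persist data.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create a table and insert a row using a parameterized query.
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("INSERT INTO students (name) VALUES (?)", ("Asha",))
conn.commit()

# Retrieve the data back out.
rows = cur.execute("SELECT name FROM students").fetchall()
print(rows)  # [('Asha',)]
```

Note the `?` placeholder: passing values as parameters instead of string formatting is what protects you from SQL injection.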
Why does Python code feel slow despite using multiple threads? The secret lies in how Python handles execution.
I've put together a 12-slide deep dive into Python Concurrency, moving from absolute basics to the future of Python 3.13. What's inside?
✅ Synchronous vs. Async: Why "waiting" is the biggest bottleneck.
✅ The Event Loop: How asyncio manages thousands of tasks on a single thread.
✅ The GIL (Global Interpreter Lock): Why traditional Python threading isn't always "parallel."
✅ The Future (Free-Threading): How Python 3.13+ finally enables true multi-core parallelism.
🟪 The "Kitchen" Analogy: Think of a single cook (Thread) multitasking between a gas stove (I/O) and a cutting board. That's Async. Now imagine a kitchen with multiple cooks and multiple gas stoves. That's Modern Free-Threading.
Whether you're building web scrapers (I/O-bound) or heavy data pipelines (CPU-bound), choosing the right model is key to performance. Check out the slides below!
#Python #Programming #SoftwareEngineering #Concurrency #AsyncIO #Multithreading #Python313 #TechLearning
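A tiny sketch of the event-loop idea from the slides: three simulated I/O waits run concurrently on a single thread with asyncio, so the total time is roughly one delay, not three. The `fetch` function and its delays are hypothetical stand-ins for real network calls.

```python
import asyncio
import time

async def fetch(name, delay):
    # Simulate an I/O-bound task (e.g. a network request) with a sleep.
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.perf_counter()
    # The event loop interleaves the waits: ~0.2s total, not 0.6s.
    results = await asyncio.gather(
        fetch("a", 0.2), fetch("b", 0.2), fetch("c", 0.2)
    )
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(results, f"{elapsed:.2f}s")
```

Swap the sleeps for CPU-heavy work and this gains you nothing, which is exactly where the GIL and free-threading discussion comes in.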
The moment you add a new condition to an if/else chain in your pipeline, you've modified existing code! That's exactly the smell OCP is meant to catch. 🔍 Open for extension, closed for modification means adding new behavior by writing new code, not patching old code! In my latest article, I break down what OCP looks like in Python data projects: strategy patterns, abstract base classes, and the tradeoffs of when the extra structure is actually worth it. https://lnkd.in/eE9yicvC This is Part 2 of my SOLID series. What's the most painful "just add another elif" you've ever had to maintain? 😅 #Python #SoftwareEngineering #SOLID #DataScience #CleanCode
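For illustration, one sketch of the strategy-pattern idea described here: a hypothetical file-loading step where supporting a new format means writing a new class, not patching an if/elif chain. The Loader classes and names are invented for this example, not taken from the article.

```python
from abc import ABC, abstractmethod

class Loader(ABC):
    """Strategy interface: one implementation per file format."""
    @abstractmethod
    def load(self, path: str) -> str: ...

class CsvLoader(Loader):
    def load(self, path):
        return f"loading CSV from {path}"

class JsonLoader(Loader):
    def load(self, path):
        return f"loading JSON from {path}"

# Registry of strategies: extending = adding an entry, not editing load_file.
LOADERS = {"csv": CsvLoader(), "json": JsonLoader()}

def load_file(path):
    ext = path.rsplit(".", 1)[-1]
    return LOADERS[ext].load(path)  # no if/elif chain to patch
```

Adding Parquet support later touches only new code: a `ParquetLoader` class and one registry entry.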
Let's now talk about Variables in Python.
What is a variable? Think of it like a box: you give it a name and store a value inside it.
Example:
a = 10
name = 'Alice'
price = 19.99
Here, a, name, and price are all variables in which we have stored some value or data. Simple, right?
But here's where most beginners make mistakes: naming their variables wrong. There are 3 rules and conventions you need to follow:
1️⃣ Start names with a lowercase letter (a style convention per PEP 8)
2️⃣ Never start a variable name with a number
3️⃣ Never use spaces; use an underscore instead
Break rule 2 or 3 and Python will throw a SyntaxError before your code even runs. Rule 1 won't cause an error, but following it keeps your code readable and consistent. Get these right from day one and you'll save yourself a lot of frustration later.
#Python #DataAnalytics #data #python #learnpython #dataanalyst #pythonforbeginners
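You can even ask Python whether a name is legal: the built-in str.isidentifier() method checks the hard rules (it knows nothing about the PEP 8 lowercase convention, which is style only).

```python
# str.isidentifier() reports whether a string is a legal variable name.
print("total_price".isidentifier())  # True  (snake_case with underscore is fine)
print("2nd_value".isidentifier())    # False (starts with a number)
print("unit price".isidentifier())   # False (contains a space)
```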
We have written a Python Qt program to load multiple LAS files representing different runs on the same well, depth-shift each run to the base run, and then merge these files into a single merged LAS file. It is all available on GitHub. We have just released the beta version of our Merge petrophysical software, written in Python using Qt. For me, this has been a real time saver. Give it a try.
What it can do today:
· Load multiple LAS files
· Display multiple runs on a typical Andy McDonald depth plot (a red box on the depth plot shows the extent of a GR-type tool available in each run)
· Choose your base run, then select each run to be shifted to it
· Determine numerous automatic depth shifts for each run, or do this manually
· Save the depth-shifted data for each run, then move on to the next run
· After all runs have been depth shifted to the base run, merge all files and save a single merged LAS file for later use
We would still like to add the ability to trim each run, estimate missing curves, and QC all data.
This is the link on GitHub: https://lnkd.in/gHVEVdsC and to launch the program, use this command from the app's root directory:
python -m apps.merge_gui.main
#Petrophysics #Python #OpenSource #ReservoirCharacterization #DataScience #ArtificialIntelligence
Day 5 of my Python learning journey — diving deeper into OOP Today I implemented two core classes: BankAccount and InventoryItem, with a focus on clean design and proper encapsulation. A key takeaway: Validate first, then modify state — never the other way around. I also caught a few common pitfalls early: • Exposing internal attributes directly • Performing validation after state changes • Duplicating logic across methods This exercise reinforced an important principle: a well-designed class should be self-protecting. External code should interact with it safely through defined interfaces, without risking inconsistent state. Feeling more confident with structuring real-world logic using classes and applying these concepts in API design with FastAPI. GitHub Repository: https://lnkd.in/dqBZtfpM #Python #OOP #FastAPI #SoftwareEngineering #CleanCode #LearningInPublic
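As a rough illustration of the "validate first, then modify state" principle described above (a hypothetical sketch, not the code from the linked repository):

```python
class InsufficientFunds(Exception):
    """Raised when a withdrawal would overdraw the account."""

class BankAccount:
    def __init__(self, owner, balance=0):
        self.owner = owner
        self._balance = balance          # internal state, not exposed directly

    @property
    def balance(self):                   # read-only access via a defined interface
        return self._balance

    def withdraw(self, amount):
        # Validate first, then modify state; never the other way around.
        if amount <= 0:
            raise ValueError("amount must be positive")
        if amount > self._balance:
            raise InsufficientFunds("withdrawal exceeds balance")
        self._balance -= amount
        return self._balance
```

Because every check happens before `self._balance` changes, a failed withdrawal can never leave the account in an inconsistent state.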
⏰ Day 56 of My Python Journey – Building an Alarm Clock Today I combined my knowledge of date & time manipulation with automation and text-to-speech to create a simple alarm program in Python. 🔹 What I built: An alarm that checks the current time continuously using the datetime module. When the set alarm time matches, it triggers a voice alert using the pyttsx3 library. Added a loop to repeat the spoken message multiple times for emphasis. 🔹 Key Learnings: How to integrate multiple modules (datetime, time, pyttsx3) to solve a real-world problem. The importance of continuous loops and condition checks in automation tasks. How text-to-speech can make Python programs more interactive and user-friendly. ✨ Reflection: Crossing Day 56 feels exciting because I’m now building programs that connect directly to everyday life. From simple algorithms to now creating an alarm clock, Python is proving to be a versatile tool for both problem-solving and practical applications. #Python #Day56 #LearningJourney #Automation #TextToSpeech #CodingConsistency #ProblemSolving
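A minimal sketch of the alarm logic described above. The alarm time is a placeholder, and the pyttsx3 calls are shown commented out since that library is a third-party install:

```python
import time
from datetime import datetime

ALARM_TIME = "07:30"  # hypothetical alarm time, HH:MM

def is_alarm_time(now, alarm=ALARM_TIME):
    """Return True when the current time matches the alarm time."""
    return now.strftime("%H:%M") == alarm

def run_alarm(poll_seconds=1):
    # Continuously check the clock until the alarm time arrives.
    while not is_alarm_time(datetime.now()):
        time.sleep(poll_seconds)
    # Repeat the spoken alert for emphasis (requires: pip install pyttsx3).
    # engine = pyttsx3.init()
    # for _ in range(3):
    #     engine.say("Wake up!")
    #     engine.runAndWait()
```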
Day 5 of my Python Full Stack journey. ✅
Today's topic: Functions: write once, use anywhere.
This is where Python starts feeling like real programming. Instead of copying the same code 10 times, you wrap it in a function and call it.
Here's what I typed today:

def greet(name, role="Developer"):
    return f"Hey {name}, future {role}!"

msg = greet("Punith")
print(msg)  # Output: Hey Punith, future Developer!

Biggest lesson today: Default arguments make functions flexible. You only pass them if you want to override the default. Small thing. But it made everything click.
Here's what I covered in 5 days:
→ Variables & Data Types
→ Conditionals
→ Loops
→ Functions
→ Built a Calculator from scratch and pushed it to GitHub ✅
45 minutes a day. No excuses.
#PythonFullStack #Day5 #Week1Done #BuildingInPublic #100DaysOfCode #Bangalore
I didn’t really understand NumPy until I asked a simple question: Why was it even created in the first place? Python, by design, is flexible and easy to use… but that flexibility comes at a cost. When developers started using Python for scientific computing and data-heavy tasks, they ran into real problems: * Working with large numerical data was slow * Memory usage was inefficient * Simple operations required too many loops and too much code And that’s exactly where NumPy came in. It wasn’t created to “add features” to Python — it was created to fix a bottleneck. NumPy introduced a new way of handling data: A structured, typed array that allows computations to happen at a much lower level (closer to C speed), while still writing code in Python. So instead of telling Python how to loop through every element… you just tell NumPy what operation you want — and it handles the rest efficiently. That shift is the real innovation. NumPy is not just about arrays. It’s about changing the way computation is done in Python — from step-by-step instructions to vectorized thinking. And that’s why it became the foundation of everything that came after it. #Python #NumPy #DataScience #MachineLearning #SoftwareEngineering
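The shift from loops to vectorized thinking can be shown in a few lines (assuming NumPy is installed):

```python
import numpy as np

# Loop-based approach: Python steps through every element itself.
def add_lists(a, b):
    return [x + y for x, y in zip(a, b)]

# Vectorized approach: you state the operation, NumPy runs it in C.
def add_arrays(a, b):
    return np.asarray(a) + np.asarray(b)

print(add_lists([1, 2, 3], [10, 20, 30]))   # [11, 22, 33]
print(add_arrays([1, 2, 3], [10, 20, 30]))  # [11 22 33]
```

On three elements the difference is invisible; on millions of elements, the vectorized version is typically orders of magnitude faster, which is the bottleneck NumPy was created to fix.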