🚀 Day 50 of #100DaysOfCode — Filtering Even Numbers from an Array

Hey everyone! 👋 Today’s challenge was about data filtering — extracting only the even numbers from an array and returning them in a new list.

👨‍💻 What I practiced today:
✅ Iterating through arrays with loops
✅ Using the modulo operator (%) to check divisibility
✅ Conditionally building a new array
✅ Understanding the difference between filtering and modifying

📌 Today's Task:
✔ Create a function get_evens(arr)
✔ Return a new array containing only even numbers
✔ Keep the original array unchanged

🧠 Example:
Input: [1, 2, 3, 4, 5, 6] → Output: [2, 4, 6]
Input: [1, 3, 5] → Output: []

💡 Key Takeaway: Filtering is a fundamental data-processing skill. Whether you're cleaning datasets, preparing inputs for algorithms, or simply extracting meaningful information, knowing how to filter efficiently is key to writing clear and performant code.

#100DaysOfCode #Python #Arrays #Filtering #EvenNumbers #DataProcessing #CodingJourney #Day50
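A minimal get_evens matching the task above, using a list comprehension so the original array is never modified:

```python
def get_evens(arr):
    """Return a new list containing only the even numbers in arr."""
    return [x for x in arr if x % 2 == 0]

print(get_evens([1, 2, 3, 4, 5, 6]))  # [2, 4, 6]
print(get_evens([1, 3, 5]))           # []
```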
Pradumya Gupta’s Post
🧹 Data Cleaning with Pandas 🐼📊

Real-world data is messy — and cleaning it is non-negotiable. In this carousel, I’ve shared how to handle:
✔️ Empty / NaN values
✔️ Removing vs. replacing missing data
✔️ Using dropna() and fillna()
✔️ Mean-, median- & mode-based replacement

📌 Remember: good analysis starts with clean data.

👉 Swipe through to learn practical Pandas data-cleaning techniques.

#Pandas #DataCleaning #Python #DataAnalysis #DataScience #EDA #CleanData #DataEngineering
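A minimal sketch of these techniques on a small made-up DataFrame (the column names are illustrative, not from the carousel):

```python
import pandas as pd
import numpy as np

df = pd.DataFrame({
    "age":  [25, np.nan, 31, np.nan, 40],
    "city": ["Pune", "Delhi", None, "Mumbai", "Delhi"],
})

# Removing: drop any row that contains a missing value
dropped = df.dropna()

# Replacing: fill numeric gaps with mean or median, categorical gaps with mode
mean_filled   = df.assign(age=df["age"].fillna(df["age"].mean()))
median_filled = df.assign(age=df["age"].fillna(df["age"].median()))
mode_filled   = df.assign(city=df["city"].fillna(df["city"].mode()[0]))
```

dropna() shrinks the dataset while fillna() preserves its size; which one is right depends on how much data you can afford to lose.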
🐍 Data Aggregation Using pandas groupby()

Once data is clean, the next step is summarizing it to uncover meaningful insights — and groupby() in pandas makes this powerful and simple.

Here’s what I’m practicing with groupby() 👇
• Grouping data by categories (like region, product, or date)
• Calculating totals, averages, and counts
• Comparing performance across groups
• Turning detailed data into summary-level insights

groupby() helps transform large datasets into clear, business-ready information with just a few lines of code.

#Python #Pandas #GroupBy #DataAggregation #DataAnalysis #EDA #LearningJourney #AspiringDataAnalyst #DataCommunity
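A short sketch of the pattern above, using a hypothetical sales table (the data is made up for illustration):

```python
import pandas as pd

sales = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "amount": [100, 200, 300, 400],
})

# Totals, averages, and counts per region in one line
summary = sales.groupby("region")["amount"].agg(["sum", "mean", "count"])
print(summary)
```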
📊 From Data Exploration to Data Cleaning with Pandas

Real-world datasets are rarely clean, and that’s where data cleaning becomes critical. In this step, I moved from simply viewing the data to actually understanding its quality by working with sample values and Pandas code.

🔍 What I practiced:
✔ Exploring the dataset using df.head()
✔ Detecting missing values with df.isnull().sum()
✔ Identifying empty columns
✔ Cleaning the data using dropna()

This small exercise clearly showed how missing values can impact analysis, and why cleaning is a must before any modeling or insights.

Clean data isn’t optional. It’s the foundation of reliable insights 🚀

#Pandas #Python #DataCleaning #DataScienceJourney #LearningInPublic #DataAnalytics
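The steps above can be sketched as follows (toy data; the post doesn't show the actual dataset):

```python
import pandas as pd
import numpy as np

df = pd.DataFrame({
    "name":  ["Ana", "Ben", None],
    "score": [90, np.nan, 75],
})

print(df.head())          # explore the first rows
print(df.isnull().sum())  # count missing values per column

clean = df.dropna()       # drop rows with any missing value
```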
Exploring Python inside Excel highlighted something important for me: the real value of a tool isn’t its technical power—it’s how effectively others can use it.

When advanced analytics live inside a familiar platform like Excel:
• Insights move faster to decision-makers
• Processes become easier to standardize and repeat
• Less effort goes into “how,” more into “why and what next”

I’m increasingly interested in designing workflows that scale insight—not just execution. That mindset shift is what excites me most about Python in Excel.

#GrowthMindset #Analytics #PythonInExcel #DataThinking #CareerDevelopment
Column transformation + groupby changed how I analyze data 📊

Raw data doesn’t give insights. Prepared data does.

While working with Pandas, I realized how powerful simple column transformations are:
• Cleaning percentage columns and converting them to numeric
• Creating new logic-based columns (BONUS vs. NO BONUS)
• Adding derived columns instead of touching raw data

Once the columns made sense, groupby unlocked the patterns. Grouping by department and aggregating values revealed insights that were invisible at the row level.

Big lesson:
➡️ Clean columns first
➡️ Group second
➡️ Insights follow

Question for data folks: do you transform your columns before groupby, or did you learn this the hard way? 😅

#DataAnalytics #Python #Pandas #GroupBy #LearningInPublic
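A minimal sketch of this workflow. The column names (performance_pct, department) and the 80% bonus threshold are assumptions for illustration; the post doesn't show the real data:

```python
import pandas as pd

df = pd.DataFrame({
    "department":      ["Sales", "Sales", "HR", "HR"],
    "performance_pct": ["85%", "60%", "92%", "70%"],
})

# 1) Clean the percentage column and convert it to numeric (raw column untouched)
df["performance"] = df["performance_pct"].str.rstrip("%").astype(float)

# 2) Derived logic-based column (hypothetical threshold)
df["bonus"] = df["performance"].apply(lambda p: "BONUS" if p >= 80 else "NO BONUS")

# 3) Group second — insights follow
summary = df.groupby("department")["performance"].mean()
print(summary)
```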
DSA — LeetCode #760: Find Anagram Mappings

I worked on a classic Data Structures & Algorithms problem and added it to my LeetCode journey.

Problem Statement
Given two integer arrays s and t, where t is an anagram of s, return an array mapping such that mapping[i] is the index in t where the value s[i] appears. If a value occurs multiple times, any valid index can be returned.

Approach
• Build a hash map from each value in t to its index (or list of indices).
• Iterate through s and fetch the corresponding index from the map.
• Build the result array from these indices.

Example
Input: s = [16, 5, 12, 10], t = [5, 10, 12, 16]
Output: [3, 0, 2, 1]

Complexity
✅ Time Complexity: O(n)
✅ Space Complexity: O(n)

#DSA #LeetCode #ProblemSolving #Python #CodingInterview #Algorithms #DataStructures #HashMap #Anagram #CodingPractice #TechJourney #SoftwareEngineering #InterviewPrep #LearnToCode
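A sketch of the approach with a plain dict; for duplicate values, later indices overwrite earlier ones, which is fine since any valid index may be returned:

```python
def anagram_mappings(s, t):
    """Map each value in s to an index where it appears in t. O(n) time and space."""
    index_of = {value: i for i, value in enumerate(t)}
    return [index_of[value] for value in s]

print(anagram_mappings([16, 5, 12, 10], [5, 10, 12, 16]))  # [3, 0, 2, 1]
```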
#Day7 of #100DaysofAI

Today I dived deeper into Pandas foundations, getting more comfortable with its core data structures and building a solid base for real-world data work.

Learned how a Series behaves like a labeled 1D array (similar to a single column), and how a DataFrame is a 2D table with rows and columns, which is the main structure used for analytics and ML preprocessing.

Big takeaway: understanding Series vs. DataFrame, and getting comfortable with loading, exploring, and cleaning CSVs, is the gateway to proper EDA and feature engineering in any ML project. Every serious data pipeline starts with clean, well-structured DataFrames.

#AIJourney #LearningInPublic #Python #Pandas #DataCleaning #BuildInPublic
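A quick illustration of the Series vs. DataFrame distinction described above:

```python
import pandas as pd

# A Series: a labeled 1D array, like a single column
s = pd.Series([10, 20, 30], index=["a", "b", "c"])

# A DataFrame: a 2D table of rows and columns
df = pd.DataFrame({"price": [10, 20, 30], "qty": [1, 2, 3]})

# Selecting one column of a DataFrame gives you back a Series
print(type(df["price"]))
```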
One concept I understand better now is list comprehensions.

Sometimes my Python scripts felt cluttered with repetitive for loops and .append() calls. I used to think list comprehensions were just a shorthand trick, but they are actually a cleaner way to handle data transformation. It’s the difference between writing a paragraph and writing a perfectly concise sentence.

The syntax used to look intimidating, but here is how it finally clicked for me:

new_list = [expression for item in iterable if condition]

• The expression: what do you want to keep? (e.g., x or x**2)
• The loop: where is it coming from? (e.g., for x in my_list)
• The filter: do you want all of them or just some? (e.g., if x > 10)

Why I’m using it from now on:
• Readability: once you know the syntax, you can read the logic in a single line.
• Speed: it is often faster than an equivalent loop with .append().
• Less room for error: no need to initialize empty lists or manage counters.

Small shifts in syntax lead to big jumps in code quality.

NativesPlug @locus.ioe #LOCUS2026 #NativesPlug #LearningChallenge #15DayChallenge #NepalTech #LearnAndWin #PythonTips
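Here is the loop-and-append version next to its comprehension equivalent, using the expression, loop, and filter from the breakdown above:

```python
my_list = [4, 8, 15, 16, 23, 42]

# Loop + .append() version
squares = []
for x in my_list:
    if x > 10:
        squares.append(x**2)

# Equivalent comprehension: [expression for item in iterable if condition]
squares_lc = [x**2 for x in my_list if x > 10]

print(squares_lc)  # [225, 256, 529, 1764]
```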
How Does Strikethrough in Excel Work
Written by $DiligentTECH💀⚔️

Does your spreadsheet feel like a graveyard of forgotten promises? Have you ever gazed at a cell and felt the weight of a SyntaxError in your soul, wishing you could just... let go?
https://lnkd.in/dy_zauDp

In the standard script of life, not every line of code deserves to be executed. Sometimes we need to keep the memory of a variable alive while ensuring it no longer impacts our final output.

1. The Syntax of Letting Go
In Python, we use # to comment out thoughts that no longer serve the logic of our main function. In Excel, Strikethrough is that elegant horizontal line that whispers, "I remember you, but you are no longer my reality." It isn't a del command that erases the data forever; it’s a visual False boolean.
https://lnkd.in/daeRnb93
📊 Pandas groupby() — an underrated superpower for analysts

Most business insights come from grouping data the right way.

Why it matters:
• Speeds up analysis
• Eliminates repetitive manual work
• Reduces reporting errors
• Delivers quick, decision-ready insights

If you work with large datasets, groupby() is a game changer. What’s your favorite groupby() use case?

#BusinessAnalyst #Python #Analytics #Learning #DataAnalysis