🚨 Python Fun Fact Day 8

Python can remove duplicates… in ONE line 🔥

nums = [1, 2, 2, 3, 4, 4]
unique_nums = list(set(nums))
print(unique_nums)

Output (order may vary): [1, 2, 3, 4]

Wait… that's it?? Yup. set() automatically removes duplicates, because sets only store unique values.

But there's a catch: sets are unordered, so you might get [2, 1, 4, 3].

If you want the original order preserved:

nums = [1, 2, 2, 3, 4, 4]
unique_nums = list(dict.fromkeys(nums))
print(unique_nums)

Output: [1, 2, 3, 4]

Real-world use cases:
• Data cleaning (your daily scene 😏)
• Removing duplicate users/emails
• Preprocessing datasets for ML

Python be like: "Kam code, zyada kaam"

📌 Day 9 coming tomorrow: A comparison trick that looks illegal… but works perfectly 😳

Follow for more "yeh itna easy tha?" moments 😄

#Python #Programming #Developers #Coding #AI #DataScience #LearnPython
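To make the "removing duplicate emails" use case concrete, here is a minimal sketch; the sample sign-up data is invented for illustration:

```python
# Hypothetical sign-up log containing duplicate emails
emails = ["a@x.com", "b@x.com", "a@x.com", "c@x.com", "b@x.com"]

# dict.fromkeys keeps only the first occurrence of each key,
# and dicts preserve insertion order (guaranteed since Python 3.7)
unique_emails = list(dict.fromkeys(emails))
print(unique_emails)  # → ['a@x.com', 'b@x.com', 'c@x.com']
```

Unlike list(set(...)), this keeps the first occurrence of each value in its original position.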
Uman Sheikh’s Post
🚨 Python Fun Fact Day 6

You don't need a temp variable to swap values 😏

a = 5
b = 10
a, b = b, a
print(a, b)

Output: 10 5

Wait… how did that even work?

In many languages, you'd do:

temp = a
a = b
b = temp

But Python says: "3 lines? Nah… 1 hi kaafi hai" 😌

What's happening under the hood: Python builds a tuple on the right-hand side:

(a, b) = (b, a)

Then it unpacks the values back into the variables.

Why this is powerful:
• Cleaner code
• Less chance of bugs
• Widely used in real-world Python

Bonus trick: you can rotate more than 2 variables too:

a, b, c = 1, 2, 3
a, b, c = c, a, b
print(a, b, c)

Output: 3 1 2

Python be like: "Shortcut bhi elegant hona chahiye"

📌 Day 7 coming tomorrow: Two variables that look the same… but behave totally differently 😈

Follow for more "yeh itna simple tha??" moments 😄

#Python #Programming #Developers #Coding #AI #DataScience #LearnPython
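The same idiom also swaps elements in place, for example inside a list (a common move in sorting code); a small sketch with made-up data:

```python
# Tuple unpacking swaps list elements without a temp variable
nums = [10, 20, 30]
nums[0], nums[2] = nums[2], nums[0]
print(nums)  # → [30, 20, 10]
```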
🚨 Python Fun Fact Day 5

A tuple is immutable… right? 👀 Then explain THIS 😳

t = ([1, 2, 3], [4, 5])
t[0].append(99)
print(t)

Expected: an error, because tuples are immutable

Actual output: ([1, 2, 3, 99], [4, 5])

What's going on? Yes, tuples are immutable… BUT they only protect the structure, not the contents.

So:
• You can't change the tuple itself
• But you can modify objects inside it

In this case, the list inside the tuple is still mutable, so .append() works perfectly fine.

Simple breakdown: the tuple is a locked container, but the items inside are free to move.

Why this matters:
• Can lead to unexpected data changes
• Important in nested data structures
• Common confusion in interviews 😅

Python be like: "Main bahar se strict hoon… andar se flexible"

📌 Day 6 coming tomorrow: A clean Python trick that replaces 3 lines with just 1 😎

Follow for more "yeh allowed hai??" moments 😄

#Python #Programming #Developers #Coding #AI #DataScience #LearnPython
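One way to see that the tuple itself never changes is to check the identity of the inner list before and after the mutation. A minimal sketch:

```python
t = ([1, 2, 3], [4, 5])

inner_id = id(t[0])          # identity of the list stored in slot 0
t[0].append(99)              # mutate the list *inside* the tuple
print(id(t[0]) == inner_id)  # → True: the tuple still holds the same object
print(t)                     # → ([1, 2, 3, 99], [4, 5])

# What immutability actually forbids: rebinding a slot
try:
    t[0] = [7, 8]
except TypeError:
    print("rebinding a tuple slot raises TypeError")
```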
Day-7 Python + AI: Understanding Tuples for Efficient Data Handling

Tuples are an important data type in Python, especially useful in AI when working with fixed and secure data.

Why Tuples Matter in AI
- Immutable (cannot be changed), ensuring data integrity
- Faster than lists for fixed data
- Useful for storing coordinates, labels, and structured data

Example Program

# Using a tuple in an AI-like scenario
data = (1, 2, 3, 4, 5)

# Simple processing
result = tuple(x * 2 for x in data)

print("Original Data:", data)
print("Processed Data:", result)

Benefits of Using Tuples in Python for AI
- Ensures data safety with immutable structures
- Improves performance in data handling
- Simple and efficient for structured datasets
- Scalable for real-world AI applications

Understanding tuples helps in building reliable and efficient AI systems using Python.

#Python #AI #MachineLearning #DataScience #Programming
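A practical consequence of immutability worth adding: tuples are hashable, so coordinates or labels can serve as dict keys, which lists cannot. A small sketch (the coordinate data is invented):

```python
# Tuples are hashable, so they work as dict keys; lists do not.
label_map = {(0, 0): "origin", (1, 2): "point A"}
print(label_map[(1, 2)])  # → point A

try:
    bad = {[0, 0]: "origin"}  # lists are unhashable
except TypeError:
    print("lists cannot be dict keys")
```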
Machine Learning Project Update

I've recently built and published a Machine Learning project on GitHub using Python. The main idea of this project is to train a model that learns patterns from input data and uses them to make predictions on new, unseen data.

What the model does:
• Takes input data and analyzes patterns in it
• Learns relationships between features during training
• Uses what it learned to predict outcomes on new data

In simple terms, the model improves its accuracy by learning from examples, just like recognizing patterns over time. This project helped me understand how machine learning models actually "learn" from data and make decisions based on it.

GitHub Repository: https://lnkd.in/eCHXNHhC

I'm continuing to learn and build more projects in AI/ML. Feedback and collaboration are always welcome.

#ML #Github #Python
Python operators

You're running through arithmetic, assignment, and relational ops with `n1 = 40, n2 = 30` initially. A few things to note from your output:

*Arithmetic section* - works as expected:
- *Addition*: 40 + 30 = 70
- *Difference*: 40 - 30 = 10
- *Multiplication*: 40 * 30 = 1200
- *Division*: 40 / 30 = 1.333...
- *Modulo*: 40 % 30 = 10
- *Floor division*: 40 // 30 = 1

*Assignment section* - this is where it gets tricky:

n1 += n2   # n1 becomes 70
print(n1)  # 70
n1 -= n2   # n1 back to 40
print(n2)  # 30, n2 never changed
n1 *= n2   # n1 becomes 1200
print(n2)  # still 30

Notice: you're printing `n2` after modifying `n1`, so `n2` stays 30 the whole time. That's why you see 30 printed repeatedly.

Later you reassign `n1 = 120` and `n2 = 100`, so the relational tests are using 120 vs 100, not the original 40 and 30.

*Relational section*: 120 == 100 → False, 120 != 100 → True, etc. All correct.
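For reference, here is a runnable version of the walkthrough, reconstructed from the output described above (the exact variable names and print layout are assumptions):

```python
n1, n2 = 40, 30

# Arithmetic
print(n1 + n2)    # → 70
print(n1 - n2)    # → 10
print(n1 * n2)    # → 1200
print(n1 / n2)    # ≈ 1.333 (true division always returns a float)
print(n1 % n2)    # → 10
print(n1 // n2)   # → 1

# Augmented assignment mutates only the left-hand name
n1 += n2
print(n1, n2)     # → 70 30  (n2 is untouched)
n1 -= n2
print(n1, n2)     # → 40 30

# Relational, after the later reassignment
n1, n2 = 120, 100
print(n1 == n2, n1 != n2, n1 > n2)  # → False True True
```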
Today was a valuable learning experience focused on AI with Python. I developed a project that analyzes resumes against job descriptions and produces an analysis with a matching score. I also created a simple user interface using Streamlit and Markdown, working in Visual Studio Code with LangChain, the Chroma vector database, embeddings, and Google Gemini.

Here are the main AI tools used:

- Loaders: from langchain_community.document_loaders import PyPDFLoader, Docx2txtLoader, TextLoader
- Splitters: from langchain_text_splitters import RecursiveCharacterTextSplitter
- Vector database: from langchain_community.vectorstores import Chroma
- Gemini models: from langchain_google_genai import ChatGoogleGenerativeAI, GoogleGenerativeAIEmbeddings
- Embedding model: embedding = GoogleGenerativeAIEmbeddings(model="models/gemini-embedding-001")
- LLM model: llm = ChatGoogleGenerativeAI(model="models/gemini-2.5-flash", temperature=0.3)
#Mathematics #Python #NumberTheory #Primes #Algorithms #Coding #DataScience

Why do Prime Numbers "follow" the number e?

There is a deep connection between prime numbers and Euler's constant e (2.718...). At first glance, they seem unrelated: discrete building blocks of integers vs. continuous exponential growth. But the Least Common Multiple (LCM) of the range 1 to n reveals the connection.

The LCM is the "Prime Blueprint": the smallest number that contains the highest power of every prime building block needed to construct every integer from 1 to n. As n grows, the value of the LCM tracks with eⁿ.

Mathematically, this is expressed through two beautiful limits:

1️⃣ lim (n→∞) ln(lcm(1…n)) / n = 1
2️⃣ lim (n→∞) ⁿ√lcm(1…n) = e

To visualize this, we used the Sieve of Eratosthenes, a 2,000-year-old algorithm that finds primes by "sifting." You start at 2 and cross out all its multiples, then move to 3 and do the same. By the time you reach √n, every composite number has been filtered out.

How efficient is it? The total cost is roughly O(n log log n) arithmetic operations, assuming O(1) array access. It is nearly as fast as just counting the numbers!

Key Takeaways from the analysis:
✅ Infinite Recharge: This convergence implies that primes are infinite. If we ran out of primes, the LCM growth would collapse and fall behind eⁿ.
✅ The "Staircase" Effect: The LCM only jumps when n hits a prime power (2, 3, 4, 5, 7, 8, 9...). Between these jumps, the integer staircase stays flat while eⁿ pulls ahead.
✅ No Infinite Gaps: For the growth to stay linear, primes must appear frequently enough to "kick" the staircase back up to the target line.

Mathematics is the art of finding the smooth curve hidden behind a jagged set of integers. The Python code to find the prime numbers and the LCM is in the attached report.
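The attached report isn't reproduced here, but the claim is easy to check numerically with the standard library alone. A minimal sketch (n = 1000 is an arbitrary choice; math.lcm requires Python 3.9+):

```python
import math

def sieve(n):
    """Sieve of Eratosthenes: return all primes up to n."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # cross out multiples of p, starting at p*p
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, flag in enumerate(is_prime) if flag]

print(sieve(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

# Check limit 1: ln(lcm(1..n)) / n should approach 1 as n grows
n = 1000
L = math.lcm(*range(1, n + 1))  # math.log accepts arbitrarily large ints
print(math.log(L) / n)          # close to 1 already at n = 1000
```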
Launching pyptx — a Python DSL for writing NVIDIA PTX kernels directly. https://lnkd.in/e2yZSjs9

Today I'm open-sourcing a project I've been building on personal time: pyptx, a Python DSL where the function body is the PTX instruction stream. One PTX instruction = one Python call. No optimizer, no autotuner, no tile IR between you and the hardware.

Why? Because the newest GPU features — Hopper's wgmma, TMA multicast, mbarrier-based pipelines, and Blackwell's tcgen05.mma + TMEM + cooperative 2-SM MMA — often only exist at the PTX level. For developers chasing peak performance, that has historically meant writing inline PTX inside CUDA C++. pyptx brings that whole path into Python, callable from JAX (via typed XLA FFI) and PyTorch (eager, torch.compile, and a C++ extension fast path).

A few numbers from real silicon:
• H100 bf16 GEMM: 815 TFLOPS, competitive with cuBLAS at matrix sizes ≥ 6K
• B200 bf16 GEMM: 1240 TFLOPS on the 1SM kernel
• RMSNorm: 2.6 TB/s (88% of HBM3 peak, 3.9× PyTorch eager)
• SwiGLU: 2.8 TB/s (94% of HBM3 peak)

The other half of the project is a transpiler in the opposite direction:

python -m pyptx.codegen kernel.ptx --sugar

It takes PTX from anywhere — nvcc, Triton, CUTLASS output, DeepGEMM kernels — and emits editable pyptx Python. The parser/emitter round-trips byte-identical on 218+ real-world kernels. So you can read someone else's kernel as Python, modify it, and ship the result.

Built end-to-end: parser, IR, emitter, transpiler, JAX integration, PyTorch integration, full Hopper + Blackwell ISA coverage, and multi-arch wheels published to PyPI. ~17K lines of Python total. Ships with maintained GEMM, grouped GEMM, RMSNorm, LayerNorm, and SwiGLU kernels for both Hopper and Blackwell, plus the PTX → Python transpiler.

pip install pyptx[torch]   # for PyTorch
pip install pyptx[jax]     # for JAX
pip install pyptx[all]     # both

Repo: https://lnkd.in/e9hYpPHt
Docs: https://pyptx.dev

If you write GPU kernels — especially if you've ever wished Triton would let you express a specific wgmma pattern, or wanted to read a CUTLASS PTX dump as editable Python — try it. PRs welcome, especially Blackwell tuning and new ISA wrappers.
🚨 Python Fun Fact Day 12

Sometimes "is" becomes True… for no obvious reason

a = 256
b = 256
print(a is b)

Output: True

Now try this:

a = 257
b = 257
print(a is b)

Output: False (…wait what??)

What's going on? Python does a hidden optimization called integer caching: numbers from -5 to 256 are pre-stored in memory, so Python reuses the same object.

That's why:
256 is 256 → True
257 is 257 → False (new objects created)

Important note: this behavior can vary depending on:
• Python version
• Environment (REPL, script, etc.)

So never rely on this in real code.

Golden rule:
Use == for value comparison.
Use "is" only for identity checks (like None):

if x is None:  # correct

Python be like: "Optimization bhi karunga… confuse bhi karunga"

Real-world lesson: if you don't understand "is" vs ==, bugs will find you.

📌 Day 13 coming tomorrow: A one-line list that creates a BIG hidden bug

Follow for more "yeh kya chal raha hai Python?" moments 😄

#Python #Programming #Developers #Coding #AI #DataScience #LearnPython
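One way to see the cache inside a single script: two 257 literals in the same file may share a compiled constant, so this sketch builds one int at runtime via int() instead. It relies on CPython implementation details, exactly the kind of thing the post warns against depending on:

```python
# CPython detail: ints in [-5, 256] are pre-allocated singletons.
a = 256
b = int("256")      # built at runtime, still returns the cached object
print(a is b)       # → True on CPython

c = 257
d = int("257")      # outside the cache: a brand-new object
print(c is d)       # → False on CPython

# The portable rules:
print(a == b, c == d)   # value comparison works everywhere → True True
x = None
print(x is None)        # identity is the right tool for None → True
```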
Day-5 Python + AI: Role of Data Types in Intelligent Systems

Data types are essential in Python, especially in AI, where data is the core of every model. Proper use of data types helps in efficient processing and better predictions.

Common Data Types in Python for AI
- int, float → numerical data
- list, tuple → data collections
- dict → structured data (key-value)
- NumPy array → high-performance computations

Concept: Raw Data → (List / Array) → Processing (AI Model) → Output (Prediction)

Example Program

import numpy as np

# Different data types
numbers = [1, 2, 3, 4]          # list
array_data = np.array(numbers)  # NumPy array

# Simple AI-like processing
prediction = array_data * 2

print("Input Data:", array_data)
print("Predicted Output:", prediction)

Benefits of Using Python Data Types in AI
- Efficient handling of different data types
- Faster computation with optimized libraries
- Easy model building and testing
- Scalable for real-world AI applications

Understanding data types is the first step toward building powerful AI solutions with Python.

#Python #AI #MachineLearning #DataScience #Programming
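The same raw-data → processing → output flow can be sketched with built-in types only, which helps before NumPy enters the picture (a minimal sketch; the data is invented):

```python
# Raw data → processing → output, using only built-in types
raw = [1, 2, 3, 4]               # list: mutable collection
frozen = tuple(raw)              # tuple: immutable snapshot of the input
record = {"features": frozen}    # dict: structured key-value data

# Simple AI-like processing step
prediction = [x * 2 for x in record["features"]]
print(prediction)  # → [2, 4, 6, 8]
```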