How API Fetch Works in JavaScript

Today I explored fetching data from APIs in JavaScript. Here's the core idea:

```javascript
const response = await fetch('https://lnkd.in/gtuyqVPs');
const data = await response.json();
console.log(data);
```

Key takeaways:
- fetch() → requests data from an API
- async/await → makes asynchronous code readable
- response.json() → parses the JSON response body into a usable JavaScript object

Fun tip: I even visualized this in a colorful notebook-style page with doodles, icons, and notes. It really helps to understand the flow of data!

#JavaScript #WebDev #API #CodingTips #LearningByDoing #TechNotes
JavaScript API Fetch Explained
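One caveat worth adding to the snippet above: fetch() only rejects on network failure, so HTTP errors (404, 500) still resolve. Checking response.ok catches them. This is a hedged sketch, not the original post's code; it uses a hand-built Response (global in Node 18+ and browsers) so it runs without touching the network.

```javascript
// fetch() resolves even for HTTP error statuses — response.ok tells them apart.
// Hand-built Response objects stand in for real requests here.
const good = new Response('{"id": 1}', { status: 200 });
const bad  = new Response('Not Found', { status: 404 });

console.log(good.ok, bad.ok); // true false

// A small wrapper (hypothetical helper, not from the post) that turns
// HTTP errors into rejections before parsing the body.
async function getJson(url) {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  return response.json();
}
```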
More Relevant Posts
Sharing a fresh perspective.

Maps and Sets in JavaScript: A Deep Dive

JavaScript provides several ways to store and manage data, with objects and arrays being the most commonly used structures. However, when you need to store unique values, or key-value pairs with more efficient lookups, Maps and Sets come to the rescue!

Read → https://lnkd.in/dvP2-4nH

#CodingJourney #DevelopersLife #TechSkills #Blog
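For readers who want the one-screen version of what the linked article covers, here is a quick sketch of both structures (sample data is made up):

```javascript
// Set: only unique values are kept.
const tags = new Set(['js', 'node', 'js']);
console.log(tags.size);        // 2 — the duplicate 'js' was dropped
console.log(tags.has('node')); // true

// Map: key-value pairs where keys can be ANY type, not just strings.
const scores = new Map();
const player = { name: 'Ada' };  // an object as a key — awkward with plain objects
scores.set(player, 42);
console.log(scores.get(player)); // 42
console.log(scores.size);        // 1
```

Lookups by key (`has`, `get`) are where Maps and Sets beat arrays, which need a linear scan.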
🚀 JavaScript Tip: JSON vs structuredClone()

Most developers copy objects like this 👇

```javascript
const copy = JSON.parse(JSON.stringify(obj));
```

👉 It works… but only for simple data:
❌ Drops undefined values
❌ Converts Date → string
❌ Drops functions
❌ Breaks for Map, Set, and circular references

✅ Modern way:

```javascript
const copy = structuredClone(obj);
```

✔️ A real deep clone
✔️ Preserves Date, Map, and Set
✔️ Handles circular references
✔️ Clean and safe

💡 Rule of thumb: use structuredClone() in modern apps; fall back to the JSON trick only for plain, JSON-safe data. (Note that structuredClone() still can't clone functions — it throws for them.)

#JavaScript #Frontend #WebDevelopment #CodingTips 🚀
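The pitfalls above are easy to see side by side. A minimal sketch (sample object is made up; structuredClone is global in Node 17+ and modern browsers):

```javascript
const original = {
  created: new Date('2024-01-01'),
  label: undefined,
  ids: new Set([1, 2]),
};

// JSON round trip: Date degrades to a string, undefined disappears, Set becomes {}.
const jsonCopy = JSON.parse(JSON.stringify(original));
console.log(typeof jsonCopy.created);     // 'string'
console.log('label' in jsonCopy);         // false — the key was dropped
console.log(jsonCopy.ids instanceof Set); // false

// structuredClone: the types survive.
const realCopy = structuredClone(original);
console.log(realCopy.created instanceof Date); // true
console.log(realCopy.ids.has(2));              // true
console.log('label' in realCopy);              // true — undefined is preserved
```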
⚡ 1-Minute JavaScript

Remove duplicate values from an array in a single line.

Utility function:

```javascript
const unique = (arr) => [...new Set(arr)];
```

Usage:

```javascript
unique([1, 2, 2, 3, 3, 4]); // [1, 2, 3, 4]
```

💡 Why this works: Set is a built-in JavaScript data structure that only stores unique values. When you pass an array into a Set, duplicates are automatically dropped; the spread operator (...) then converts the Set back into an array.

🔎 Practical use cases:
🧹 Cleaning duplicate API data
📊 Preparing datasets for charts
🧾 Removing repeated IDs before processing
⚡ Quick array normalization

Small trick. Cleaner data.

#JavaScript #FrontendDevelopment #CleanCode #WebDevelopment #DevTips
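One caveat the one-liner hides: Set compares objects by reference, so it won't dedupe objects with equal contents. A Map keyed on a chosen field handles that case — a sketch under the assumption that records carry an `id` field:

```javascript
// Dedupe by a key function: Map keeps one entry per key (later entries
// overwrite earlier ones), and .values() gives the surviving objects back.
const uniqueBy = (arr, keyFn) => [...new Map(arr.map(x => [keyFn(x), x])).values()];

const rows = [{ id: 1 }, { id: 2 }, { id: 1 }];
console.log(uniqueBy(rows, r => r.id)); // [ { id: 1 }, { id: 2 } ]
```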
This JavaScript behavior quietly breaks real applications.

```javascript
[] + []  // ""
```

Looks harmless. But here's what's happening: the + operator can't add arrays, so JavaScript coerces each one to a string first.

```javascript
[].toString()  // ""
```

So the expression becomes:

```javascript
"" + ""  // ""
```

Now imagine this with:
- API responses
- user inputs
- dynamic data

No error. No warning. Just incorrect logic. These bugs don't show up early. They show up when your data stops being predictable.
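The same coercion has a few siblings worth knowing, and the defense is an explicit type check. A sketch (the `sumLengths` guard is a hypothetical helper, not from the post):

```javascript
// More silent string coercions from the same rule:
console.log([] + {});   // "[object Object]" — both operands stringified
console.log([1] + [2]); // "12" — concatenation, not addition
console.log(1 + [2]);   // "12" — the number gets coerced too

// Guarding with Array.isArray turns the silent coercion into a loud error.
function sumLengths(a, b) {
  if (!Array.isArray(a) || !Array.isArray(b)) {
    throw new TypeError('expected arrays');
  }
  return a.length + b.length;
}
console.log(sumLengths([1, 2], [3])); // 3
```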
Day 87 of me reading random, basic, but important dev topics...

Today I read about Blobs in JavaScript.

As developers, we deal with file uploads and downloads in the browser. But what happens under the hood, and how does JS handle binary data? While ArrayBuffer is part of the core ECMA standard, the browser's File API gives us a higher-level abstraction: the Blob (Binary Large Object).

What exactly is a Blob? Unlike a raw ArrayBuffer, a Blob represents binary data with a type. It consists of an optional string type (usually a MIME type) and blobParts (a sequence of strings, BufferSources, or even other Blobs).

Construction — we build one by passing an array of parts and an options object:

```javascript
let blob = new Blob(
  [new Uint8Array([72, 101, 108, 108, 111]), ' ', 'world'],
  { type: 'text/plain', endings: 'native' }
);
```

Immutability — just like JavaScript strings, Blobs are entirely immutable. We cannot directly edit the data inside a Blob. However, we can create new Blobs from existing ones using the .slice() method:

```javascript
blob.slice([byteStart], [byteEnd], [contentType]);
```

This lets us chop up files for chunked uploads or assemble new files in memory without altering the original binary data.

Keep learning!

#JavaScript #WebDevelopment #FrontendDev #SoftwareEngineering #WebAPIs
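The slicing behavior described above can be seen directly — .slice() produces a new Blob and leaves the original untouched. A minimal sketch (Blob is global in Node 18+ as well as browsers; in older Node it lives in 'node:buffer'):

```javascript
const blob = new Blob(['Hello', ' ', 'world'], { type: 'text/plain' });
console.log(blob.size); // 11 — total bytes across all parts
console.log(blob.type); // "text/plain"

// slice() returns a NEW Blob over the chosen byte range.
const firstWord = blob.slice(0, 5, 'text/plain');
console.log(firstWord.size); // 5
console.log(blob.size);      // still 11 — the original is immutable
```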
Today I learned something interesting about fetching data in JavaScript.

When we use fetch(), many beginners notice that it usually has two .then() calls. At first this looks unnecessary, but there is a clear reason behind it: the first .then() handles the HTTP response returned by fetch(), and the second .then() receives the actual JSON data produced by response.json().

Example:

```javascript
fetch("API_URL")
  .then(response => response.json())
  .then(data => {
    console.log(data);
  });
```

While revising this concept, I also learned a cleaner, more modern approach using async/await, which makes asynchronous code easier to read and understand:

```javascript
async function getData() {
  const response = await fetch("API_URL");
  const data = await response.json();
  console.log(data);
}
```

Both approaches work the same way internally, using Promises, but async/await makes the flow feel more natural. Small concepts like these help in writing cleaner and more maintainable JavaScript code.

#JavaScript #WebDevelopment #AsyncJavaScript #FetchAPI #CodingJourney #LearningInPublic
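The reason for the second .then() is worth making concrete: response.json() itself returns a Promise, because the body arrives as a stream and parsing it is asynchronous. A small demonstration using a hand-built Response (global in Node 18+ and browsers), so no network is needed:

```javascript
const response = new Response('{"ok": true}');
const parsing = response.json();

// json() does not return the data directly — it returns a Promise for it,
// which is exactly why the chained pattern needs a second .then().
console.log(parsing instanceof Promise); // true
parsing.then(data => console.log(data.ok)); // logs: true
```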
Day 85 of me reading random, basic, but important dev topics...

Today I read about bridging the gap between binary data and text in JavaScript with TextDecoder.

To translate raw bytes into readable text, standard string manipulation won't cut it. We need TextDecoder: a built-in browser and Node.js object that reads a buffer and converts it into an actual JavaScript string based on a specified encoding (UTF-8 by default, but it supports windows-1251, big5, etc.).

```javascript
let uint8Array = new Uint8Array([72, 101, 108, 108, 111]);
let decoder = new TextDecoder(); // defaults to utf-8
console.log(decoder.decode(uint8Array)); // "Hello"
```

Some tricks I picked up:

1. Zero-copy subarray decoding. What if our string is buried inside a larger buffer (e.g. a custom network packet where the first byte is an ID and the string starts at byte 2)? Don't slice the array — slice() copies memory. Instead, use subarray() to create a new view over the same memory block and decode that:

```javascript
let packet = new Uint8Array([0, 72, 101, 108, 108, 111, 0]);
let stringView = packet.subarray(1, -1); // no memory copied!
console.log(new TextDecoder().decode(stringView)); // "Hello"
```

2. Handling streams (stream: true). When receiving data in chunks (like from the Streams API), a multi-byte character — an emoji or an accented letter — might get split in half across two network chunks. Decoded separately, the halves come out as the replacement character (�) instead of the intended text. By passing { stream: true } to .decode(), the TextDecoder remembers the unfinished bytes and seamlessly combines them with the next chunk.

Keep learning!

#JavaScript #WebDevelopment #Performance #DataStreams #SoftwareEngineering
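The stream trick can be shown in a few lines. The euro sign '€' is three UTF-8 bytes (0xE2 0x82 0xAC); here they arrive split across two chunks, as they might from the network:

```javascript
const decoder = new TextDecoder();

const chunk1 = new Uint8Array([72, 105, 0xE2]); // "Hi" + first byte of '€'
const chunk2 = new Uint8Array([0x82, 0xAC]);    // the remaining two bytes

// With stream: true the decoder buffers the incomplete character and
// finishes it when the rest arrives.
const text = decoder.decode(chunk1, { stream: true }) + decoder.decode(chunk2);
console.log(text); // "Hi€"

// Decoding chunk1 on its own yields the replacement character instead.
console.log(new TextDecoder().decode(chunk1)); // "Hi�"
```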
✨ Day 19 of My JavaScript Journey 🚀

Today I explored Promises in JavaScript and clarified JSON vs JavaScript objects.

🔹 Promises – a modern way to handle asynchronous operations, making code cleaner and avoiding callback hell. Learned about then(), catch(), and chaining promises.

🔹 JSON vs JS objects – JSON is a string format for data exchange ("key": "value"), whereas JS objects are live objects in memory that you can manipulate in code. Converting between them using JSON.stringify() and JSON.parse() is essential for working with APIs.

Understanding Promises and JSON made me realize how asynchronous programming and data exchange actually work in real-world web apps.

#JavaScript #100DaysOfCode #WebDevelopment #LearningJourney #FrontendDevelopment #Promises #JSON
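Both halves of the day's topic fit in a few lines. A sketch (the payload and the numbers in the chain are made up):

```javascript
// JSON round trip: string in, live object, string back out.
const json = '{"name":"Ada","skills":["js"]}'; // a string, as it arrives from an API
const user = JSON.parse(json);                 // now a live object in memory
user.skills.push('node');                      // mutable, like any object
const outgoing = JSON.stringify(user);         // back to a string for sending
console.log(outgoing); // {"name":"Ada","skills":["js","node"]}

// Promise chaining: each then() receives the previous step's return value.
Promise.resolve(2)
  .then(n => n * 3)
  .then(n => console.log('chained result:', n)) // chained result: 6
  .catch(err => console.error(err));
```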
The Inside Story of Node.js

Node.js is a runtime environment that lets you run JavaScript outside the browser. It was created by Ryan Dahl in 2009, who took Google's V8 engine out of the browser and combined it with a powerful library called libuv.

Node.js is made up of two major components:
- V8 engine: compiles JavaScript code into machine code
- libuv: handles asynchronous I/O for Node.js

When you run a JavaScript file with Node.js, it creates a process that runs your JavaScript code on a single thread. Startup roughly looks like this:
- Load any modules pulled in via require()
- Register event callbacks
- Prepare internal resources
- Create a thread pool for expensive background tasks

The event loop then continuously checks for pending tasks and dispatches asynchronous callbacks. Its main phases are:
- Expired timer callbacks
- I/O polling
- setImmediate callbacks
- Close callbacks

Node.js uses the libuv thread pool to run heavy tasks in the background. By default this pool has 4 threads, and you can resize it (via the UV_THREADPOOL_SIZE environment variable) depending on your workload.

Understanding these internals gives you a clear mental model of what Node.js is doing behind the scenes.

Source: https://lnkd.in/gUEiDCAy
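The single-threaded model above has one immediately observable consequence: all synchronous code in the main script finishes before the event loop runs any callback. A tiny sketch:

```javascript
const order = [];

setTimeout(() => order.push('timer callback'), 0); // queued for the timers phase
setImmediate(() => order.push('setImmediate'));    // queued for the check phase
order.push('synchronous code');

// At this point only the synchronous push has run — both callbacks are
// still waiting for the event loop to reach their phases.
console.log(order); // [ 'synchronous code' ]
```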