Learning Docker as a DevOps Engineer 🐳

As someone transitioning into DevOps, I knew I needed to really understand containerization: not just follow tutorials, but build something real.

Project: https://lnkd.in/dKuRrRrm

🎯 What I Built:
✅ Multi-container app (Java + Node.js + PostgreSQL + Nexus)
✅ Docker Compose orchestration
✅ Multi-stage builds (800MB → 250MB!)
✅ Custom networks for service isolation
✅ Persistent volumes (learned this the hard way)
✅ Deployed to DigitalOcean droplets

💡 Concepts That Clicked:

🔹 Containers ≠ VMs
A completely different paradigm: VMs virtualize hardware, while containers virtualize the OS.

🔹 Multi-stage builds
Build dependencies don't belong in production images. My Java image dropped from 800MB to 250MB.

🔹 Docker networks
Services discover each other by name. My Java app reaches Nexus at `http://nexus:8081`, with no IP configuration needed.

🔹 Volumes save lives
I lost my entire Nexus repository once when I restarted a container. Volumes = data that survives.

📚 Learning Journey:

Week 1: Breaking everything
"Why does my container exit immediately?"
"Where's my database data?"
"How do containers communicate?"

Week 2: Everything clicks
Multi-stage builds, networks, volumes: it all makes sense now.

🛠️ Tech Stack:
🐳 Docker & Docker Compose
☕ Java (Maven)
🟢 Node.js
🐘 PostgreSQL
📦 Nexus Repository
🔧 Nginx
☁️ DigitalOcean

🎓 Skills Gained:
- Writing efficient Dockerfiles
- Orchestrating multi-container apps
- Managing persistent data
- Container networking
- Cloud deployment (DigitalOcean)
- Debugging containerized apps

📖 Project Includes:
✓ Documented Dockerfiles (with the WHY, not just the WHAT)
✓ Docker Compose setup
✓ Volume & networking examples
✓ DigitalOcean deployment guide
✓ Mistakes I made + fixes
✓ Security basics

💭 Real Talk:
This is a learning project, not production-ready. But it gave me hands-on experience with the Docker concepts that matter in DevOps. Learning by building beats following tutorials every time.
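As a rough sketch of how the name-based discovery and persistent-volume points above fit together in a docker-compose.yml (the service names, ports, and volume names here are illustrative, not the project's actual config):

```yaml
services:
  app:
    build: ./java-app
    environment:
      # The app reaches Nexus by service name on the shared network
      NEXUS_URL: http://nexus:8081
    networks: [backend]

  nexus:
    image: sonatype/nexus3
    volumes:
      # Named volume so repository data survives container restarts
      - nexus-data:/nexus-data
    networks: [backend]

networks:
  backend:

volumes:
  nexus-data:
```

Compose's embedded DNS resolves `nexus` to the container's IP on the `backend` network, which is why no hard-coded IPs are needed.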
🎯 Next Steps:
- Kubernetes orchestration
- CI/CD with Jenkins
- Terraform for IaC
- Monitoring setup

For anyone learning DevOps: build something, break it, fix it, repeat. That's how concepts stick.

Check it out: https://lnkd.in/dKuRrRrm

Fellow learners: what project made Docker click for you? 👇

#DevOps #Docker #LearningInPublic #Containerization #CloudEngineering #CareerTransition
Docker DevOps Project: Multi-Container App with Java, Node.js, PostgreSQL, and Nexus
More Relevant Posts
-
I Just Built Something I Wish I Had 5 Years Ago

Starting today, I am sharing #20DaysOfDocker, a completely free Docker learning challenge on GitHub.

Why? Learning Docker was confusing for me:
🎥 Long YouTube videos that wasted time
📚 Articles jumping around with no clear path
💸 Expensive courses that became outdated
❌ Nobody explained why things matter, just how

So I created something different.

What's Inside:

Week 1 (Days 1-7): Basics
- What Docker is & how images work
- Write your first Dockerfile
- Run & manage containers
- Connect containers
- Save data with volumes
- Use Docker Compose

Week 2 (Days 8-14): Get Better
- Advanced Compose
- Make images smaller & faster
- Best practices & debugging
- Security tips (Day 14 is detailed!)

Week 3 (Days 15-20): Go Professional
- Monitor containers
- Connect Docker to CI/CD
- Orchestration & microservices
- Deploy to production
- Build a real project

Why It's Different:
✅ Real examples & copy-paste code
✅ Clear, simple explanations
✅ Hands-on exercises with solutions
✅ Organized day by day
✅ Completely free & always updated

Why You Should Care:
🏢 80% of tech companies use Docker
💼 DevOps jobs almost always require it
💰 Docker skills can boost salary 15-25%
📈 Essential for modern apps & microservices

How It Works:
Each day: a folder + objectives + exercises + solutions
Time: 30 min – 2 hours per day

Next Steps:
⭐ Star the repo on GitHub
📖 Read Day 1 - Docker Basics
💻 Do the exercises & ✅ check the solutions
📤 Share your progress

Who Is This For?
- Beginners, backend developers, career changers
- Anyone wanting better-paying jobs
- People who tried Docker but got lost

Day 1 is live TODAY. Tomorrow, Day 2 drops, and it continues for 20 days. 🐳

Learn Docker properly. Build real skills. Boost your career.

GitHub: https://lnkd.in/dtVn3ieP

#Docker #DevOps #FreeLearning #OpenSource #CareerGrowth #Tech
-
Day 3: Dockerfiles Explained Like Never Before – Build, Optimize with Multi-Stage Builds & Reduce Image Size by Up to 70% 🐳

On Day 1, we ran containers. On Day 2, we understood images. But today… everything changes.

👉 What if the exact image you need doesn't exist?
👉 What if you want full control over your environment?

That's where Dockerfiles come in. In Day 3 of #20DaysOfDocker, we stop relying on others and start building our own images from scratch.

👉 What you'll learn:
- What Dockerfiles really are (more than just a config file)
- All essential instructions (FROM, RUN, COPY, CMD, etc.)
- How to build custom images step by step
- Multi-stage builds (build big → ship small)
- Best practices used in real production systems
- Optimization techniques to reduce image size dramatically

💡 The big insight: a Dockerfile is a recipe for consistency. Same code + same Dockerfile = same environment anywhere. No more "it works on my machine." ❌

Hands-on (real learning):
- Write your first Dockerfile
- Build your own image
- Optimize it step by step
- Use multi-stage builds to cut size by up to 70% ⚡

Why this matters:
- Smaller images = faster deployments
- Optimized builds = lower costs
- Clean structure = easier maintenance
- Real skill = real DevOps growth

By the end of Day 3, you're not just running containers… you're engineering them.

👉 Start Day 3 here: https://lnkd.in/dtVn3ieP

Tomorrow, we go even deeper. Let's keep building. 🐳

#Docker #DevOps #LearningInPublic #OpenSource #BackendDevelopment #CloudComputing #SoftwareEngineering #TechCommunity
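To make "build big → ship small" concrete, here is a minimal multi-stage Dockerfile sketch for a Node.js app (the stage names, paths, and build script are illustrative assumptions, not taken from the challenge itself):

```dockerfile
# Stage 1: build with the full toolchain (dev dependencies included)
FROM node:20 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: ship only the runtime artifacts on a small base image
FROM node:20-alpine
WORKDIR /app
COPY --from=build /app/dist ./dist
COPY --from=build /app/node_modules ./node_modules
CMD ["node", "dist/server.js"]
```

Only the final stage ends up in the shipped image; compilers, dev dependencies, and intermediate build output from the first stage are discarded, which is where the size reduction comes from.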
-
🚀 Just finished the Docker course on Boot.dev! 🚀

I'm excited to share that I've learned the fundamentals of Docker, a key technology in modern DevOps and CI/CD pipelines.

Docker makes it simple and fast to deploy new versions of code by packaging applications and their dependencies into preconfigured environments. This not only speeds up deployment, but also reduces overhead and eliminates the "it works on my machine" problem.

Docker is a core part of the CI/CD (Continuous Integration/Continuous Deployment) process, enabling teams to deliver software quickly and reliably. Here's a high-level overview of a typical CI/CD deployment process:

The Deployment Process:
1. The developer (you) writes some new code
2. The developer commits the code to Git
3. The developer pushes a new branch to GitHub
4. The developer opens a pull request to the main branch
5. A teammate reviews the PR and approves it (if it looks good)
6. The developer merges the pull request
7. Upon merging, an automated script, perhaps a GitHub Action, is started
8. The script builds the code (if it's a compiled language)
9. The script builds a new Docker image with the latest program
10. The script pushes the new image to Docker Hub
11. The server that runs the containers, perhaps a Kubernetes cluster, is told there is a new version
12. The k8s cluster pulls down the latest image
13. The k8s cluster shuts down old containers as it spins up new containers of the latest image

This process ensures that new features and fixes can be delivered to users quickly, safely, and consistently.

Image credit: Boot.dev Docker course

#docker #cicd #devops #softwaredevelopment #bootdev #learning
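Steps 7–10 of the process above can be sketched as a GitHub Actions workflow. This is a minimal illustration, not from the course; the image name and the secret names (`DOCKERHUB_USER`, `DOCKERHUB_TOKEN`) are placeholders you would define yourself:

```yaml
name: build-and-push
on:
  push:
    branches: [main]   # fires after the PR is merged into main

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Log in to Docker Hub
        run: echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USER }}" --password-stdin
      - name: Build image
        run: docker build -t yourname/my-app:${{ github.sha }} .
      - name: Push image
        run: docker push yourname/my-app:${{ github.sha }}
```

Tagging with the commit SHA gives the cluster an unambiguous version to pull in steps 11–13.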
-
Day 6 of #DevOpsJourneyToHired 🐳

Today's Focus: Docker Fundamentals + GitOps Project Roadmap

🐳 Docker Deep Dive:
Started with container basics, the backbone of modern DevOps:
• What containers are and why they matter
• Docker architecture: images, containers, registries
• Writing Dockerfiles: FROM, RUN, COPY, CMD, EXPOSE
• Container lifecycle: build, run, stop, remove
• Docker networking and volumes basics

💡 Why Docker first? Before building a GitOps platform, I need to master containers. You can't orchestrate what you don't understand. ArgoCD deploys containerized apps, so Docker knowledge is non-negotiable.

📋 GitOps IDP Project Roadmap Created:

**Phase 1: Foundation (Weeks 1-2)**
→ Docker mastery + basic Kubernetes
→ Set up a local K8s cluster (Minikube/Kind)
→ Create sample microservices

**Phase 2: GitOps Core (Weeks 3-4)**
→ ArgoCD installation and configuration
→ Git repository structure for IaC
→ Automated sync and deployment workflows

**Phase 3: Developer Portal (Weeks 5-6)**
→ Backstage setup and customization
→ Service catalog with templates
→ Documentation integration

**Phase 4: Enterprise Features (Weeks 7-8)**
→ Multi-environment support (dev/staging/prod)
→ RBAC and security policies
→ Monitoring and observability dashboard

🔄 Revision Work:
Reviewed Days 1-5 concepts:
• Linux fundamentals ✓
• Networking basics ✓
• AWS services ✓
• Shell scripting concepts ✓

📊 Progress Update:
Learning streak: 6 days ✅
Docker exercises completed: 5
Project roadmap: defined and documented
Applications sent: 17 total

🎯 Tomorrow: hands-on Docker practice, building and deploying containers.

What's your Docker learning journey been like?

#DevOps #Docker #Containers #GitOps #ProjectPlanning #LearningInPublic #ArgoCD
-
Day 12/30 – Docker Learning Series
Docker Exec and Interactive Containers

Today I explored how to interact with running containers, an essential skill for debugging and managing applications in Docker.

Running a container is not always enough. In real-world scenarios, we often need to go inside a container to inspect files, check processes, or troubleshoot issues.

---

What is docker exec?

The docker exec command runs a command inside an already-running container.

Basic syntax:
docker exec <container_id> <command>

---

Open an Interactive Terminal Inside a Container

docker exec -it <container_id> /bin/bash

Explanation:
-i → interactive mode (keeps STDIN open)
-t → allocates a pseudo-terminal
/bin/bash → opens a shell inside the container

If bash is not available (as in Alpine-based images), use:
docker exec -it <container_id> /bin/sh

---

Example

Run an Nginx container:
docker run -d --name mynginx nginx

Enter the container:
docker exec -it mynginx /bin/bash

Now you are inside the container and can run Linux commands.

---

Run One-Time Commands Inside a Container

docker exec mynginx ls /usr/share/nginx/html

This runs a single command without opening a full terminal.

---

What are Interactive Containers?

Interactive containers let you interact directly with the container's shell. Example:
docker run -it ubuntu /bin/bash

This starts a container and immediately opens a terminal.

---

Exit from a Container

Type:
exit

This closes the shell session. Note: exiting a docker exec shell leaves the container running, but exiting a shell started with docker run -it stops the container, because that shell is its main process.

---

Key Takeaways
• docker exec gives you access to running containers
• Useful for debugging and inspecting applications
• Interactive mode helps simulate real server environments
• An essential skill for troubleshooting in DevOps

Being able to enter and inspect containers is critical when working with production systems.

---

Day 12/30 – Docker Learning Series
Next: Dockerfile Introduction and Writing Your First Dockerfile

#Docker #DevOps #Containerization #CloudComputing #CICD #Infrastructure #SRE #LearningInPublic #TechLearning #NetworkToDevOps
-
⭐ Most platform engineers I know use Cursor for autocomplete. That's like using an excavator to dig a hole with a teaspoon attachment.

I spent the last few weeks going deep on Cursor Agent (not the tab-complete, the actual agent mode), specifically for infrastructure and DevOps work. What I found changed how I think about the tool entirely.

The agent doesn't just edit files. It:
→ Queries your live Kubernetes cluster before making a change
→ Catches open PRs that would conflict with what you're about to do
→ Investigates a 5xx incident across GitHub, kubectl, and your deploy history, all in one conversation
→ Runs terraform validate, reads the error, fixes it, and runs it again, without you typing a command

But here's the part nobody talks about: out of the box, it's generic. It doesn't know your naming conventions, your module patterns, or your "never touch this file" rules. Once you configure it properly (6 files, maybe 2 hours of setup), it's a different tool entirely.

I wrote the full breakdown: what MCP actually is, how the agent calls tools under the hood, every config file your team needs to replicate this, and 6 real use cases with exact prompts.

If you work in platform or DevOps, this one's worth the read.

Part 1 (link in the comments) and Part 2: https://lnkd.in/gpXdFjRU

#DevOps #PlatformEngineering #Kubernetes #Terraform #CursorAI #AITools #SRE
-
Day 5 of #30DaysOfDevOps: Docker Basics

Docker is one of the most important tools in DevOps. It ensures your app runs the same way on your laptop, in staging, and in production. No more "it works on my machine."

1. Why Docker?
Docker packages your app and everything it needs into a single container that runs consistently anywhere.

Containers vs VMs:
- VMs include a full OS: heavy, slow to start
- Containers share the host OS kernel: lightweight, start in seconds

2. Core Concepts
Image: a read-only template with your app and dependencies
Container: a running instance of an image
Dockerfile: instructions to build an image
Docker Hub: a public registry to store and share images

3. Essential Commands

Run a container:
docker run -d -p 8080:80 nginx

List running containers:
docker ps

Stop and remove:
docker stop 3f2a1b
docker rm 3f2a1b

Shell into a running container:
docker exec -it 3f2a1b bash

4. Writing a Dockerfile

FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
EXPOSE 4000
CMD ["node", "server.js"]

Build and run:
docker build -t my-app:v1.0 .
docker run -d -p 4000:4000 my-app:v1.0

5. Push to Docker Hub
docker tag my-app:v1.0 yourname/my-app:v1.0
docker login
docker push yourname/my-app:v1.0

6. Optimization Tips
- Use alpine images (often around 5x smaller than full OS images)
- Add a .dockerignore to exclude node_modules and .git
- Copy package files before source code to maximize layer caching

7. Challenges for Today
1. Install Docker and verify with: docker run hello-world
2. Run an nginx container on port 8080 and open it in your browser.
3. Write a Dockerfile for a Python or Node.js app and build it.
4. Tag your image and push it to Docker Hub.
5. Shell into a running container and explore the filesystem.
6. Add a .dockerignore and observe the build context size difference.

Drop your Docker Hub image link in the comments.

#DevOps #Docker #Containers #Dockerfile #30DaysOfDevOps #LearningInPublic #DevOpsEngineer #CloudComputing
-
New blog post on my Kubernetes journey!

Added a self-signed TLS cert to Grafana on my k3s Raspberry Pi cluster: fully GitOps, encrypted with SOPS + age, delivered through FluxCD.

The writeup covers the full flow:
- Generate the cert with openssl
- Render the secret manifest with kubectl --dry-run
- Encrypt it in place with SOPS (only the data fields; metadata stays readable for sane diffs)
- Push to Git and let Flux handle the rest

Of course it didn't go perfectly. Two bugs hit back to back:
1. A YAML indentation error that took down the entire kustomize build. Two extra spaces. TWO. I've been making a lot of syntax errors lately... lol.
2. A missing Kustomization in my clusters/ directory: the secret was sitting in Git the whole time, but Flux had no idea the path existed. A good reminder that Flux doesn't auto-discover directories; you have to explicitly register every path you want it to watch.

Both are documented in the post, because the bugs are honestly more useful than the happy path.

Post here: https://lnkd.in/e4TPRMan

The blog is a personal journal of GitOps and homelab experiments: notes, writeups, and lessons from running a single-node k3s cluster on a Raspberry Pi 4. Built with Astro, open on GitHub at https://lnkd.in/e96m_QJt.

#GitOps #Kubernetes #Homelab #FluxCD #DevOps #k3s #SOPS #TLS #SelfHosted
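For readers who want the shape of that flow, here is a sketch of the four steps (the hostname, file names, and secret name are illustrative placeholders, not the post's actual values, and this assumes an age key is already configured for SOPS):

```shell
# 1. Generate a self-signed cert and key (illustrative CN/SAN)
openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
  -keyout grafana.key -out grafana.crt \
  -subj "/CN=grafana.local" -addext "subjectAltName=DNS:grafana.local"

# 2. Render the TLS secret manifest without touching the cluster
kubectl create secret tls grafana-tls \
  --cert=grafana.crt --key=grafana.key \
  --dry-run=client -o yaml > grafana-tls-secret.yaml

# 3. Encrypt only the data fields in place; metadata stays readable
sops --encrypt --in-place --encrypted-regex '^(data|stringData)$' \
  grafana-tls-secret.yaml

# 4. Commit and push; Flux decrypts and applies the rest
git add grafana-tls-secret.yaml
git commit -m "Add Grafana TLS secret"
git push
```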
-
🐳 Top Docker Commands Every Developer Should Know

If you're working with Docker, mastering a few core commands can make your workflow faster, cleaner, and more efficient. Here are some essential Docker commands every developer should know:

🔹 1. Check Docker version
docker --version

🔹 2. Pull an image from Docker Hub
docker pull nginx

🔹 3. List images
docker images

🔹 4. Run a container
docker run -d -p 3000:3000 node-app

🔹 5. List running containers
docker ps

🔹 6. List all containers (including stopped)
docker ps -a

🔹 7. Stop a container
docker stop <container_id>

🔹 8. Remove a container
docker rm <container_id>

🔹 9. Remove an image
docker rmi <image_id>

🔹 10. View logs
docker logs <container_id>

🔹 11. Execute a command inside a container
docker exec -it <container_id> bash

🔹 12. Build an image
docker build -t my-app .

🔹 13. Docker Compose up
docker-compose up -d
(or docker compose up -d with the newer Compose plugin)

🔹 14. Docker Compose down
docker-compose down

💡 Pro Tip
You don't need to memorize everything, but knowing these commands covers 80% of real-world Docker use cases. Mastering the Docker CLI is a big step toward becoming a DevOps-ready developer 🚀

#Docker #DevOps #Containerization #WebDevelopment #CloudComputing #CICD #SoftwareEngineering #BackendDevelopment #TechSkills #Programming
-
🚀 Week 5 of My DevOps Learning Journey

This week was a major step forward: I moved from basic CI concepts to building a complete automated pipeline using Jenkins and Docker.

📅 Week 5 – What I Learned

🔹 Jenkins Pipelines (Pipeline as Code)
• Created and configured Pipeline jobs in Jenkins
• Learned how a Jenkinsfile defines the entire workflow
• Understood stages like Build and Run in a structured pipeline

🔹 GitHub Integration & Automation
• Connected Jenkins with GitHub repositories
• Configured automatic build triggers (Poll SCM)
• Achieved real Continuous Integration: every code change triggers a build

🔹 Running Applications in CI
• Executed Python and Java applications through Jenkins pipelines
• Used shell commands inside pipelines to automate tasks
• Understood how Jenkins interacts with the system environment

🔹 Docker Integration (🚀 Big Milestone)
• Created a Dockerfile for a Java application
• Built Docker images inside Jenkins
• Ran containers as part of the pipeline
• Successfully executed the application inside a container

This week really helped me understand how modern applications are built, packaged, and executed in a CI/CD workflow.

💡 Key Takeaways:
• CI/CD is not just theory; it's all about automation in action
• Debugging (PATH issues, tools, environments) is a core DevOps skill
• Docker + Jenkins together form a powerful automation pipeline

📌 Week 5 complete. Next up: pushing Docker images to Docker Hub and moving towards real deployment.

🔗 GitHub Repository: End-to-end CI/CD pipeline using Jenkins, Docker, and Java
👉 https://lnkd.in/gJihH_g5

End-to-end CI/CD pipeline: GitHub → Jenkins → Docker → Java App 🚀

#DevOps #CICD #Jenkins #Docker #Automation #GitHub #Java #CloudComputing #LearningJourney #TechGrowth
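A minimal declarative Jenkinsfile along the lines described above might look like this (the stage names, Maven build command, and image tag are illustrative assumptions, not copied from the repository):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Compile and package the Java app (assumes a Maven project)
                sh 'mvn -B package'
            }
        }
        stage('Docker Build') {
            steps {
                // Build the image from the repo's Dockerfile,
                // tagged with Jenkins' build number for traceability
                sh 'docker build -t my-java-app:${BUILD_NUMBER} .'
            }
        }
        stage('Run') {
            steps {
                // Run the app inside a container as part of the pipeline
                sh 'docker run --rm my-java-app:${BUILD_NUMBER}'
            }
        }
    }
}
```

Because the whole workflow lives in the Jenkinsfile, it is versioned alongside the code, which is the "Pipeline as Code" idea in practice.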