New on Foojay: Java Faceted Full-Text Search API Using MongoDB Atlas Search. Luke Thompson shows you how to build a faceted full-text search API with Java and MongoDB Atlas Search. The article covers:
• Setting up MongoDB Atlas Search indexes
• Implementing faceted search functionality
• Building a REST API with Java
• Handling search queries and filtering results
Perfect for developers looking to add powerful search capabilities to their Java applications without the complexity of managing separate search infrastructure.
Read the full article here: https://lnkd.in/emJPYwep
#Java #MongoDB #SearchAPI #FullTextSearch #foojay
Java Full-Text Search API with MongoDB Atlas Search
-
Written by Mike LaSpina and published on Friends of OpenJDK (Foojay.io): learn CQRS in Java as he dives into cleanly separating reads and writes. What you'll learn:
- How the MongoDB Spring repository can be used to abstract MongoDB operations
- Separating reads and writes in your application
- How separating these can make schema design changes easier
- Why you should avoid the save() and saveAll() functions in Spring
Read it here 👉 https://lnkd.in/gs8pnGpk
#mongodb #java #nosql #database
-
🚀 Day 3 of My Advanced Java Journey – Mastering CRUD Operations in JDBC

Today, I implemented one of the most important concepts in backend development — CRUD operations using JDBC.

🔹 What is CRUD? CRUD stands for:
Create → Insert data
Read → Fetch data
Update → Modify existing data
Delete → Remove data

🔹 1. Create (INSERT) — add records to the database. ✔️ Key concept: using PreparedStatement to insert values safely.

String sql = "INSERT INTO employees(name, designation, salary) VALUES (?, ?, ?)";
PreparedStatement ps = conn.prepareStatement(sql);
ps.setString(1, "Vamsi");
ps.setString(2, "Software Engineer");
ps.setDouble(3, 60000);
ps.executeUpdate();

🔹 2. Read (SELECT) — retrieve and display data. ✔️ Key concept: using ResultSet to iterate through records.

Statement stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery("SELECT * FROM employees");
while (rs.next()) {
    int id = rs.getInt("id");
    String name = rs.getString("name");
    String designation = rs.getString("designation");
    double salary = rs.getDouble("salary");
}

🔹 3. Update (UPDATE) — modify existing records.

String sql = "UPDATE employees SET salary = ? WHERE id = ?";
PreparedStatement ps = conn.prepareStatement(sql);
ps.setDouble(1, 65000);
ps.setInt(2, 1);
ps.executeUpdate();

🔹 4. Delete (DELETE) — remove records from the database.

String sql = "DELETE FROM employees WHERE id = ?";
PreparedStatement ps = conn.prepareStatement(sql);
ps.setInt(1, 1);
ps.executeUpdate();

🔍 What I explored beyond the session
- Why PreparedStatement is preferred over Statement (it prevents SQL injection 🔐)
- The difference between executeQuery() and executeUpdate()
- The importance of handling exceptions (SQLException)
- Closing resources (Connection, Statement, ResultSet) to avoid resource leaks

💡 CRUD operations form the core of any real-world application, from simple apps to enterprise systems.
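On the last point above, closing resources: try-with-resources closes them for you, in reverse declaration order, which is exactly the order you want for ResultSet, then Statement, then Connection. A tiny stand-in demo (these are fake resources, not real JDBC objects, so the close-order mechanics are visible without a database):

```java
import java.util.ArrayList;
import java.util.List;

public class CloseOrderDemo {
    static final List<String> closed = new ArrayList<>();

    // A stand-in resource that records when it is closed.
    record Resource(String name) implements AutoCloseable {
        public void close() { closed.add(name); }
    }

    public static void main(String[] args) {
        try (Resource conn = new Resource("Connection");
             Resource stmt = new Resource("Statement");
             Resource rs = new Resource("ResultSet")) {
            // work with the resources here; no manual close() calls needed
        }
        // Closed in reverse declaration order:
        System.out.println(closed); // [ResultSet, Statement, Connection]
    }
}
```

With real JDBC you would declare the Connection, PreparedStatement, and ResultSet in the same try header and drop the finally-block cleanup entirely.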
🙌 Special thanks to the amazing trainers at TAP Academy: kshitij kenganavar Sharath R MD SADIQUE Bibek Singh Hemanth Reddy Vamsi yadav Harshit T Ravi Magadum Somanna M G Rohit Ravinder TAP Academy 📌 Learning in public. Building consistency every day. #Java #AdvancedJava #JDBC #BackendDevelopment #LearningInPublic #VamsiLearns
-
Written by Luke Thompson - MongoDB Champion, published on Friends of OpenJDK (Foojay.io): learn how to build a Java faceted full-text search API! In the tutorial, he'll walk through an interesting dataset which showcases how you can effectively pair machine learning/AI-generated data with more traditional search to produce fast, cheap, repeatable, and intuitive search engines. Dive in here 👉 https://lnkd.in/gm6a2Y77 #mongodb #java #nosql #database #atlas
-
Day 18 — #100DaysJava: today Java connected to a database for the first time. That felt real. ☕

For 17 days I have been writing Java code that runs and stops. Data lives for one second and disappears. Today I learned JDBC — and now data can actually be saved, fetched, updated, and deleted. Permanently. This is where Java starts feeling like a real backend language.

---

What is JDBC? JDBC stands for Java Database Connectivity. It is the bridge between your Java code and a database like MySQL or PostgreSQL. Without JDBC, Java cannot talk to a database. With JDBC, Java can run SQL queries directly from your code.

---

How it works — 5 simple steps every Java developer should know:

Step 1 — Load the driver. Tell Java which database you are connecting to.
Class.forName("com.mysql.cj.jdbc.Driver");

Step 2 — Create a connection. Give the database URL, username, and password.
Connection con = DriverManager.getConnection(url, user, password);

Step 3 — Create a statement. Prepare the SQL query you want to run.
Statement st = con.createStatement();

Step 4 — Execute the query. Run SELECT, INSERT, UPDATE, or DELETE.
ResultSet rs = st.executeQuery("SELECT * FROM users");

Step 5 — Close the connection. Always close after use. Never leave it open.
con.close();

---

The thing that surprised me — PreparedStatement vs Statement. Statement is simple but dangerous: if you put user input directly into a SQL query, an attacker can inject malicious SQL and destroy your database. This is called SQL injection. PreparedStatement is safe: you use placeholders (?) and the driver handles the input safely. Every real application uses PreparedStatement — never Statement with user input.

PreparedStatement ps = con.prepareStatement("SELECT * FROM users WHERE id = ?");
ps.setInt(1, userId);

---

Also learned today — CRUD operations:
CREATE → INSERT INTO
READ → SELECT
UPDATE → UPDATE SET
DELETE → DELETE FROM
These four operations are the foundation of every backend application ever built.
--- What clicked today — every app I have ever used stores data somewhere. Instagram saves your photos. Zomato saves your orders. Swiggy saves your address. JDBC is the layer that makes that possible in Java. 17 days in. The journey is getting more real every single day. 💪 Day 1 ................................................... Day 18 To any backend developer reading this — what was your first database connection moment like? Did it feel as satisfying as it did for me today? 🙏 #Java #JDBC #Database #MySQL #BackendDevelopment #100DaysOfJava #JavaDeveloper #LearningInPublic #100DaysOfCode #SQL #WebDevelopment #Programming
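The SQL injection danger described above can be shown without a database at all, just by looking at the SQL text that concatenation produces (the table and input here are made up for illustration):

```java
public class InjectionDemo {
    // Statement-style: user input is spliced into the SQL text itself.
    static String unsafeQuery(String userInput) {
        return "SELECT * FROM users WHERE id = " + userInput;
    }

    public static void main(String[] args) {
        String attack = "1 OR 1=1";  // attacker-controlled input
        // The input has rewritten the query: the WHERE clause now matches
        // every row. That is SQL injection in one line.
        System.out.println(unsafeQuery(attack));
        // With a PreparedStatement the SQL text stays fixed --
        //   con.prepareStatement("SELECT * FROM users WHERE id = ?")
        // -- and "1 OR 1=1" is sent as a value for the placeholder,
        // where it fails as a number instead of rewriting the query.
    }
}
```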
-
Full Stack Java Development is more than just knowing a few technologies—it’s about building a strong foundation and then layering the right skills in a structured way. Start with Core Java by mastering concepts like OOP, collections, exception handling, and multithreading. Once the basics are solid, move into the Spring Framework, especially Spring Boot, to understand how modern backend applications are built. Alongside this, focus on REST API development, microservices architecture, and tools like Maven or Gradle.

At the same time, develop a strong understanding of databases such as MySQL or PostgreSQL, and learn how to integrate them using JPA and Hibernate. On the frontend side, gain hands-on experience with HTML, CSS, and JavaScript, then progress to frameworks like React or Angular to build dynamic user interfaces.

A complete roadmap also includes working knowledge of Git, Docker, and CI/CD pipelines to handle real-world deployments. Exposure to cloud platforms like AWS or Azure will further strengthen your profile. The key is consistency—build projects, understand system design basics, and keep refining your problem-solving skills. Full stack development is a journey where depth and practical experience matter just as much as breadth.
#SeniorFullStackDeveloper #Java #Spring #SpringBoot #SpringMVC #SpringSecurity #SpringCloud #SpringDataJPA #Hibernate #Microservices #RESTAPI #OAuth2 #JWT #OpenAPI #Swagger #DesignPatterns #SOLIDPrinciples #Angular #AngularMaterial #NgRx #React #Redux #ReduxToolkit #VueJS #TypeScript #JavaScript #HTML5 #CSS3 #WebDevelopment #WCAG #AWS #AmazonWebServices #Azure #MicrosoftAzure #GoogleCloud #GCP #CloudComputing #CloudNative #Kubernetes #Docker #GKE #GoogleKubernetesEngine #AKS #EKS #Containerization #Orchestration #Helm #CloudInfrastructure #DevOps #CICD #Jenkins #GitHubActions #GitLabCI #AWSCodePipeline #Terraform #Automation #ReleaseEngineering #PostgreSQL #Oracle #MySQL #MongoDB #Cassandra #Redis #DynamoDB #SQL #NoSQL #DatabaseOptimization #ApacheKafka #EventDrivenArchitecture #PubSub #MessageQueues #Prometheus #Grafana #ELKStack #Elasticsearch #Logstash #Kibana #Splunk #AppDynamics #CloudWatch #Observability #JUnit #Mockito #Cucumber #CloudSecurity #IAM #ZeroTrust #APISecurity #SecureCoding #MicroservicesArchitecture #DistributedSystems #ScalableSystems #HighAvailability #FaultTolerance #PerformanceEngineering #Agile #Scrum #JIRA #Git #VersionControl #C2C #Remote
-
Day 22 — #100DaysJava: today I learned something that every backend developer uses but most beginners never hear about. ☕ Connection pooling — and specifically, HikariCP.

---

First, the problem. Every time your Java app needs to talk to a database, it has to open a connection. Opening a database connection is not free: it takes time, memory, and CPU. Now imagine 1,000 users hitting your app at the same time. If every request opens a new connection, your database crashes, your app slows to a crawl, and users leave. This is a real problem that every production system faces.

---

Connection pooling is the solution. Instead of creating a new connection for every request, you create a pool of connections at startup — say, 10. When a request comes in, it borrows one from the pool, does its work, and returns it. The next request gets that same connection. No new connection created, no overhead, the same connections reused hundreds of thousands of times. This is how every real backend application handles database traffic.

---

HikariCP is the best-known connection pool library in the Java ecosystem. It is the default connection pool in Spring Boot — that alone tells you how trusted it is. It is faster than alternatives like Apache DBCP and C3P0, with less memory, less configuration, and more stability under heavy load.

The basic setup is simple:

HikariConfig config = new HikariConfig();
config.setJdbcUrl("jdbc:mysql://localhost:3306/your_db");
config.setMaximumPoolSize(10);
HikariDataSource dataSource = new HikariDataSource(config);
Connection con = dataSource.getConnection();

That is it. Your app now has a proper connection pool.

---

The configurations that matter in production:
- maximumPoolSize — how many connections can exist at once. Set this based on your database capacity.
- minimumIdle — minimum connections kept ready even when traffic is low.
- connectionTimeout — how long a request waits before giving up if no connection is available.
- idleTimeout — how long an unused connection sits before being removed.
- maxLifetime — maximum age of any connection before it is replaced.

Most beginners use the defaults. Senior developers tune these based on actual load, query time, and database limits. That is the difference.

---

The question this made me ask today — how will my app behave under 10,000 users? That is a system design question. Most beginners never think about it. Today I started thinking about it.

The learning is moving from "how do I make this work" to "how do I make this scale." That shift feels significant.

🔗 GitHub: https://lnkd.in/gbWkSnBu

Day 1 . . . . . . . . . . . . . . . . . . . . . Day 22

To any senior developer reading this — how do you tune HikariCP in production? What pool size do you recommend for a typical Spring Boot app? I am genuinely curious.

#Java #HikariCP #ConnectionPooling #BackendDevelopment #100DaysOfJava #JavaDeveloper #100DaysOfCode #JDBC #PerformanceOptimization
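The borrow/return cycle and the connectionTimeout behaviour described above can be modelled in a few lines of plain Java. This is a toy, not HikariCP: a fixed set of "connections" held in a blocking queue, where an exhausted pool makes callers wait up to a timeout instead of opening new connections.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class ToyPool {
    private final BlockingQueue<String> pool;

    // All "connections" are created once, at startup.
    public ToyPool(int maximumPoolSize) {
        pool = new ArrayBlockingQueue<>(maximumPoolSize);
        for (int i = 0; i < maximumPoolSize; i++) pool.add("conn-" + i);
    }

    // Waits up to timeoutMs for a free connection, like connectionTimeout.
    public String borrow(long timeoutMs) {
        try {
            String c = pool.poll(timeoutMs, TimeUnit.MILLISECONDS);
            if (c == null) throw new IllegalStateException("connectionTimeout exceeded");
            return c;
        } catch (InterruptedException e) {
            throw new IllegalStateException("interrupted", e);
        }
    }

    // Returning a connection makes it available for the next caller.
    public void release(String conn) { pool.add(conn); }

    public static void main(String[] args) {
        ToyPool p = new ToyPool(2);
        String a = p.borrow(100);
        String b = p.borrow(100);
        // Pool exhausted: a third borrow waits, then gives up --
        // it never opens a new connection.
        try { p.borrow(50); } catch (IllegalStateException e) {
            System.out.println("gave up: " + e.getMessage());
        }
        p.release(a);                      // return one to the pool
        System.out.println(p.borrow(100)); // the same connection, recycled
    }
}
```

The real library adds the other knobs on top of this core loop: minimumIdle keeps spares warm, idleTimeout and maxLifetime retire connections, but the borrow/return queue is the heart of it.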
-
🚀 Built a production-grade Agentic Search Service from scratch using Spring Boot 3 + LangChain4j

What started as a simple CRUD API evolved into an intelligent search system that decides HOW to search based on what you ask.

𝗪𝗵𝗮𝘁 𝗶𝘀 𝗔𝗴𝗲𝗻𝘁𝗶𝗰 𝗦𝗲𝗮𝗿𝗰𝗵? Instead of always running the same query, the system classifies your intent first — then picks the right strategy automatically.
"laptop" → keyword search
"something portable for work" → semantic vector search
"laptops under 500 with 16GB" → LLM extracts filters → structured query
"good stuff" → asks for clarification

𝗧𝗲𝗰𝗵 𝗦𝘁𝗮𝗰𝗸
→ Spring Boot 3 + Java 17
→ LangChain4j + Groq (llama-3.3-70b) for intent classification
→ AllMiniLmL6V2 local embedding model (zero API cost)
→ pgvector on PostgreSQL for semantic similarity search
→ Redis for distributed caching
→ Apache Kafka for async write pipeline
→ HikariCP with primary/replica DB routing
→ Docker Compose for local infrastructure

𝗣𝗲𝗿𝗳𝗼𝗿𝗺𝗮𝗻𝗰𝗲 𝗙𝗲𝗮𝘁𝘂𝗿𝗲𝘀
→ @Transactional(readOnly=true) routes reads to the replica automatically via LazyConnectionDataSourceProxy
→ Redis cache with a toggle flag — on/off without code changes
→ Kafka async writes with 202 Accepted — DB pressure decoupled from API latency
→ Paginated reads with configurable sort
→ Input validation with field-level 400 error responses

𝗞𝗲𝘆 𝗗𝗲𝘀𝗶𝗴𝗻 𝗗𝗲𝗰𝗶𝘀𝗶𝗼𝗻𝘀
→ LazyConnectionDataSourceProxy — without this, read/write routing silently breaks
→ AOP proxy ordering — @Transactional must wrap before @Cacheable fires
→ Embeddings generated at write time, not search time — no embedding call on the search path
→ Kafka/cache toggleable via properties — same codebase, different behaviour per environment

𝗪𝗵𝗮𝘁 𝗜 𝗟𝗲𝗮𝗿𝗻𝗲𝗱
Building this end-to-end showed me that the gap between a working API and a production-ready service is filled with decisions most tutorials skip — connection pool tuning, proxy ordering, embedding lifecycle, broker networking in Docker.

The agentic layer on top made it clear how LangChain4j's AiServices turns an LLM into a typed Java method — no boilerplate, no JSON parsing, just an interface and annotations.

#Java #SpringBoot #LangChain4j #AI #Kafka #Redis #PostgreSQL #pgvector #SystemDesign #BackendEngineering
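The classify-then-dispatch shape of the post can be sketched without any LLM. This is a hypothetical stand-in: the real service uses an LLM (via LangChain4j + Groq) for classification, while here simple rules play that role, so only the routing structure is illustrated, not the intelligence.

```java
import java.util.regex.Pattern;

public class QueryRouter {
    enum Strategy { KEYWORD, SEMANTIC, STRUCTURED, CLARIFY }

    // Crude proxy for "the query contains filterable constraints".
    static final Pattern HAS_CONSTRAINT =
        Pattern.compile("\\d|under|over|at least|gb|\\$", Pattern.CASE_INSENSITIVE);

    // Stand-in classifier: in the real system, an LLM makes this decision.
    static Strategy classify(String query) {
        String q = query.trim();
        if (!q.isEmpty() && q.split("\\s+").length == 1) return Strategy.KEYWORD;
        if (HAS_CONSTRAINT.matcher(q).find()) return Strategy.STRUCTURED;
        if (q.split("\\s+").length >= 3) return Strategy.SEMANTIC;
        return Strategy.CLARIFY;  // too vague to route anywhere
    }

    public static void main(String[] args) {
        System.out.println(classify("laptop"));                       // KEYWORD
        System.out.println(classify("laptops under 500 with 16GB"));  // STRUCTURED
        System.out.println(classify("something portable for work"));  // SEMANTIC
        System.out.println(classify("good stuff"));                   // CLARIFY
    }
}
```

Each Strategy value would then dispatch to its own search path (full-text query, vector similarity, filtered structured query, or a clarification prompt); the point is that the decision happens once, up front, before any search runs.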
-
Running native queries in Spring? Hibernate's persistence context doesn't know about native queries. When you bypass JPQL and fire raw SQL directly, Hibernate has no way to track the changes made. The stale data quietly stays in the cache.

The fix: clearAutomatically = true on @Modifying.

@Modifying(clearAutomatically = true)
@Transactional
@Query(
    value = "UPDATE products SET price = :price WHERE id = :id",
    nativeQuery = true
)
int updatePrice(@Param("price") BigDecimal price, @Param("id") Long id);

This evicts the entire persistence context (L1 cache) after the bulk update, so the next read goes back to the DB.

Real-world scenarios where this matters:
1) Bulk price updates in e-commerce (without the clear, the old price is returned)
2) Soft deletes via native UPDATE (the cached entity still shows as "active")
3) Batch status/audit flag changes (stale flags silently break downstream logic)
4) In-app data migrations running native DML during startup or scheduled jobs

Advantages:
1) Prevents stale reads after bulk DML
2) Keeps the L1 cache consistent with DB state
3) Zero boilerplate - one annotation flag

Disadvantages:
1) Evicts the entire persistence context - not just the affected rows
2) Triggers an implicit flush before eviction (can surface unexpected dirty checks)
3) Hurts performance in large, long-running transactions
4) Does not invalidate the L2 cache (Ehcache, Redis) - handle that separately

Note: use clearAutomatically = true whenever native bulk DML touches data already loaded in your persistence context. For L2 cache invalidation, pair it with @CacheEvict or a manual eviction call.

#Java #SpringBoot #SpringDataJPA #Hibernate #SoftwareEngineering
-
Becoming a Full Stack Java Developer is not about learning everything at once, it’s about building depth in the right areas and connecting the pieces over time. Start with strong fundamentals in Core Java, focusing on object-oriented concepts, collections, multithreading, and memory management. Once the base is solid, move into the Spring ecosystem—especially Spring Boot, Spring MVC, and Spring Data JPA—to understand how real-world backend systems are designed.

At the same time, get comfortable with REST APIs, SQL/NoSQL databases, and basic system design concepts like scalability, fault tolerance, and clean architecture. On the frontend side, pick one framework like Angular or React and learn how to build responsive, accessible UI using TypeScript, HTML, and CSS.

From a senior developer’s perspective, what sets you apart is not just coding skills but how you think about systems. Learn how microservices communicate, how to secure applications using OAuth2 and JWT, and how to deploy using Docker, Kubernetes, and cloud platforms like AWS. Get hands-on with CI/CD pipelines, logging, and monitoring tools, because production experience matters more than theory. Build projects that solve real problems, understand trade-offs, and focus on writing clean, maintainable code. Over time, you’ll move from just building features to designing systems—and that’s where the real growth happens.
#SeniorFullStackDeveloper #Java #Spring #SpringBoot #SpringMVC #SpringSecurity #SpringCloud #SpringDataJPA #Hibernate #Microservices #RESTAPI #OAuth2 #JWT #OpenAPI #Swagger #DesignPatterns #SOLIDPrinciples #Angular #AngularMaterial #NgRx #React #Redux #ReduxToolkit #VueJS #TypeScript #JavaScript #HTML5 #CSS3 #WebDevelopment #WCAG #AWS #AmazonWebServices #Azure #MicrosoftAzure #GoogleCloud #GCP #CloudComputing #CloudNative #Kubernetes #Docker #GKE #GoogleKubernetesEngine #AKS #EKS #Containerization #Orchestration #Helm #CloudInfrastructure #DevOps #CICD #Jenkins #GitHubActions #GitLabCI #AWSCodePipeline #Terraform #Automation #ReleaseEngineering #PostgreSQL #Oracle #MySQL #MongoDB #Cassandra #Redis #DynamoDB #SQL #NoSQL #DatabaseOptimization #ApacheKafka #EventDrivenArchitecture #PubSub #MessageQueues #Prometheus #Grafana #ELKStack #Elasticsearch #Logstash #Kibana #Splunk #AppDynamics #CloudWatch #Observability #JUnit #Mockito #Cucumber #CloudSecurity #IAM #ZeroTrust #APISecurity #SecureCoding #MicroservicesArchitecture #DistributedSystems #ScalableSystems #HighAvailability #FaultTolerance #PerformanceEngineering #Agile #Scrum #JIRA #Git #VersionControl #C2C #Remote
-
🚀 Day 29 – Java Backend Journey | Kafka Core Concepts

Today I strengthened my understanding of Apache Kafka core concepts, which are essential for building event-driven and real-time backend systems.

🔹 What I learned today
Kafka is a distributed event streaming platform that enables systems to communicate using messages (events) in a scalable and fault-tolerant way.

🔹 Core concepts I explored

1️⃣ Producer — sends messages (events) to Kafka topics.
kafkaTemplate.send("user-topic", "User created");

2️⃣ Consumer — reads messages from topics and processes them.
@KafkaListener(topics = "user-topic")
public void consume(String message) {
    System.out.println(message);
}

3️⃣ Topic — a logical channel where messages are stored. Example: user-topic

4️⃣ Broker — a Kafka server that manages topics and handles message storage.

5️⃣ Partition — topics are divided into partitions for parallel processing and scalability.

6️⃣ Offset — each message in a partition has a unique ID (its offset) used to track consumption.

🔹 How Kafka works
Producer → Topic (partitioned) → Consumer
Messages are produced, stored in partitions, and consumed asynchronously.

🔹 What I understood
• Kafka enables asynchronous communication
• Helps build loosely coupled systems
• Supports high throughput and scalability
• Ensures fault tolerance and reliability

🔹 Key takeaway
Understanding Kafka core concepts is crucial for designing real-time, event-driven backend systems, especially in microservices architectures.

📌 Next step: explore Kafka partitions, offsets, and consumer groups in depth.

#Java #SpringBoot #Kafka #BackendDevelopment #Microservices #EventDrivenArchitecture #SoftwareEngineering #LearningInPublic #JavaDeveloper #100DaysOfCode
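The partition and offset concepts above can be made concrete with a toy in-memory model (this is not Kafka, just an illustration of the bookkeeping): a topic holds several partitions, a message's key picks its partition, and the message's position within that partition is its offset.

```java
import java.util.ArrayList;
import java.util.List;

public class ToyTopic {
    // Each inner list is one partition: an append-only log of messages.
    private final List<List<String>> partitions = new ArrayList<>();

    public ToyTopic(int partitionCount) {
        for (int i = 0; i < partitionCount; i++) partitions.add(new ArrayList<>());
    }

    // Producer side: hash the key to pick a partition, so messages with the
    // same key always land in the same partition and keep their order.
    public long send(String key, String message) {
        List<String> p = partitions.get(Math.floorMod(key.hashCode(), partitions.size()));
        p.add(message);
        return p.size() - 1;   // the message's offset within its partition
    }

    // Consumer side: read everything from a given offset onward, the same
    // way a Kafka consumer resumes from its last committed offset.
    public List<String> poll(int partition, long fromOffset) {
        List<String> p = partitions.get(partition);
        return p.subList((int) fromOffset, p.size());
    }

    public static void main(String[] args) {
        ToyTopic topic = new ToyTopic(3);
        long off1 = topic.send("user-42", "User created");
        long off2 = topic.send("user-42", "User updated");
        System.out.println(off1 + " " + off2);  // offsets grow per partition: 0 1
    }
}
```

Real Kafka adds replication, durability, and consumer groups on top, but this is the core mental model: partition chosen by key, offset as the cursor into an append-only log.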