The Bleeding Edge - Part I

My CEO once told me that I am an empirical learner.  This, he noted, can be a double-edged sword: I understand details quite well, but I may take a while to assimilate information I could otherwise get more quickly from other people.

However, one place where I find empirical learning necessary is in my role as a CTO, understanding changes in technology.  It is one thing to read blogs, follow the maelstrom of technical literature and do one's best to keep abreast of trends, and another thing entirely to dive in and play with the technology oneself.

Every few months I pick a project, whether it be consulting, personal or a favor for a friend, and I try to do it the hard way.  Not as a means of inflicting pain and frustration on myself, but as a way of exploring some really bleeding-edge implementations of projects that interest me.  There are always trends in software and in architecture, and I have been around long enough to see through the hype.  At the same time, I cannot dismiss the trends I find worth a more critical look.

So I had an opportunity to take a set of data cleansing scripts I had written for a company and turn them into a production system.  I could have taken the easy route, picked something like Grails 2.4.x, applied it to the project and been done in short order, but I took the hard way...

I wanted to explore Angular's new component system on the UI side.  I have been using Angular 1.4 on some consulting projects and felt that Angular 1.5, with the new Component Router, would be an introduction to the concepts being introduced by Angular 2.0.  On the back end I decided to go with Grails 3.1 to use the new asynchronous mechanisms and Spring Boot, and to try out WebSockets as the means for the backend and frontend to communicate.  The nature of the problem is that the data cleansing takes a fairly large dataset, runs a series of steps to prepare the data, applies some transforms, and then runs some neat Lucene fuzzy matching on it to try to find certain anomalies.
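As an aside, Lucene's fuzzy matching is built on edit distance: two terms match if one can be turned into the other within a small number of single-character edits. Here is a rough sketch of the metric underneath in plain Java, the classic dynamic-programming Levenshtein distance; this is an illustration of the idea, not Lucene's actual automaton-based implementation, and the sample strings are made up:

```java
public class EditDistance {

    // Classic two-row dynamic-programming Levenshtein distance:
    // the minimum number of single-character inserts, deletes and
    // substitutions needed to turn string a into string b.
    static int levenshtein(String a, String b) {
        int[] prev = new int[b.length() + 1];
        int[] curr = new int[b.length() + 1];
        for (int j = 0; j <= b.length(); j++) prev[j] = j;
        for (int i = 1; i <= a.length(); i++) {
            curr[0] = i;
            for (int j = 1; j <= b.length(); j++) {
                int cost = (a.charAt(i - 1) == b.charAt(j - 1)) ? 0 : 1;
                curr[j] = Math.min(Math.min(curr[j - 1] + 1,  // insert
                                            prev[j] + 1),     // delete
                                   prev[j - 1] + cost);        // substitute
            }
            int[] tmp = prev; prev = curr; curr = tmp;
        }
        return prev[b.length()];
    }

    public static void main(String[] args) {
        // A transposed-letter typo costs 2 plain edits.
        System.out.println(levenshtein("Micheal", "Michael"));
        System.out.println(levenshtein("kitten", "sitting"));
    }
}
```

A fuzzy matcher then accepts a candidate term when this distance falls at or below a small threshold (Lucene caps it at 2 edits by default).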

Part II will deal with the cuts and scrapes I learned building the backend and Part III will cover the bruises and bumps I acquired learning the frontend.

Hey Michael, cool post! Assuming your front- and back-end are distributed, can you describe how the reactive/asynchronous programming using web sockets works through the load balancer? Speaking of bleeding edge, Angular 6 is already out. Looking forward to pt 2 and 3.


http://aurelia.io/ looks interesting and by the guy that ditched Google over differences in how to design Angular2. Still early, but you know the trend, 2 days in without a new JS framework is a good trend. Also if you do like full stack-ish stuff Meteor is pretty cool, lots of nice things to jump start with Angular or React. Me... I'm holding out for the PHP of JS frameworks :)


I used AngularJS for our internal application, but for my media network a JavaScript framework is not SEO friendly. I am looking at Vert.x, and I heard that it addresses the shortcomings of Node.js.


I'll add more detail in the follow-up posts, but the larger trends I am looking at are reactive/asynchronous programming on the backend and how it is supported by a particular framework, WebSockets/STOMP for long-lived communication with the frontend, and the move to component-based UI via Angular components (a response both to the React framework and to the messiness of building larger applications in AngularJS). As with most things, I have picked a starting point with which I am comfortable, the Groovy/Grails stack and Angular, to avoid starting at square zero. I could just as easily have used Node.js or Spring Boot directly (the latter actually powers Grails 3.x). It would have been less useful to jump stacks to RoR or something completely unfamiliar to me.
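For the curious, STOMP itself is just a small text protocol framed over the WebSocket connection. Below is a minimal sketch of what a SEND frame looks like on the wire; the destination name and body are hypothetical, and a real application would use a client library (e.g. stomp.js on the frontend, Spring's STOMP support behind Grails) rather than hand-rolling frames:

```java
public class StompFrame {

    // Assemble a minimal STOMP SEND frame: a command line, header
    // lines, a blank line, the body, then a NUL terminator.
    static String send(String destination, String body) {
        return "SEND\n"
             + "destination:" + destination + "\n"
             + "content-type:text/plain\n"
             + "\n"
             + body + "\u0000";
    }

    public static void main(String[] args) {
        // "/topic/progress" is a made-up destination for illustration.
        String frame = send("/topic/progress", "step 2 of 5 complete");
        // Show the NUL terminator visibly when printing.
        System.out.println(frame.replace("\u0000", "<NUL>"));
    }
}
```

The appeal for this kind of long-running data-cleansing job is that the server can push progress messages to subscribed browsers over one persistent connection instead of the frontend polling for status.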


One thing that is challenging is that there are so many hammers available to drive the nail; how do you arrive at the right combination? Many of these technologies involve a fair amount of grunt work and can be time consuming to get right, so if you pick the wrong one it could be some time before you know it. We live in awesome times given the large number of solutions available, but things are much more complex than the days when you would pick up K&R and install the C compiler from the floppy in the back. :)

