Docker for finer-grained DevOps

While working with AWS's rudimentary image bootstrapping, which let me either boot and configure from a supported image or boot directly from our own custom image, I came to realize the cost and frustration of this archaic mechanism for bringing up a new operational node to scale out, or to update and roll back nodes. There had to be a better way.

So I started looking around for other ways of deploying and managing infrastructure. And there was Docker! It was only a couple of months old, but I was sure it would take the world by storm, and I started experimenting with it. It would allow me to build one image with all the infrastructure necessary to run an app, and deploy it everywhere! And if I needed to upgrade parts of that infrastructure, I could do so very easily and just have my nodes update themselves by pulling in the changed layers! Super cool!
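
To make that concrete, here is the kind of minimal Dockerfile I mean. It is a sketch for a hypothetical Node.js app; the base image, package names and file names are illustrative, not from an actual project of mine:

    # Bake the app and its runtime into a single image.
    # Base image and file names are illustrative.
    FROM ubuntu:12.04

    # Install the runtime in its own layer; it is cached and only
    # rebuilt when this line changes, so updating nodes pull just
    # the layers that actually changed.
    RUN apt-get update && apt-get install -y nodejs npm

    # Add the app itself as a thin layer on top.
    ADD . /app
    WORKDIR /app
    RUN npm install

    CMD ["node", "server.js"]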

Now I knew I was slowly being sucked into DevOps land, but I just had to go with my gut and explore this beautiful new territory, even though it wasn't my core expertise I was building on. This attitude allowed me to dive right in and get to know the ins and outs and the do's and don'ts of building Docker architectures. I don't want to give detailed instructions on how to do things on this blog, because there is enough of that to be found elsewhere, so let me just do what I do best, and that is to inspire others to try the stuff I am excited about.
And if there's one thing I am very excited about, it is Docker and this whole new movement in DevOps land, with such things as CoreOS utilizing automated, centralized configuration management like etcd. There's a whole slew of PaaS offerings coming our way, and our developers' lives will be made a whole lot easier thanks to the initial work of the dotCloud people 🙂

Event store with Node.js and AWS

It's been a while since I posted anything here, but a lot has happened in the meantime. Here's a quick update on the things that have interested me since then.

In 2013 I created my first auto-scalable event store architecture for a huge client in Node.js. It involved custom web servers receiving events from different endpoints in different formats, meta-tagging them, and injecting them into Amazon SQS queues, with processors on the other end enriching and transforming the events for storage in AWS DynamoDB. Post-processors would run periodically to store aggregates in S3. It was required to auto-scale to handle 200,000 events per second. (Yes, you read that right.) I created a stateless architecture with the code for all the roles (server, processor, post-processor, etc.) built into one repo, which our Bamboo server would tar and deploy onto S3, so that new nodes could be bootstrapped from it. Each node was booted by Puppet with a role to perform, and thus knew which part to play. For hot updates and rollbacks we'd tell a Saltstack master to update a certain range of nodes, which would then pull the wanted source from the S3 registry again and update themselves without downtime. Pretty nifty, but rather proprietary.
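
To give a flavour of the server role, here is a minimal sketch of the receiving end in Node.js. It assumes the aws-sdk module; the queue URL, region and meta-tag fields are made up for illustration:

    // Sketch of the server role: accept an event over HTTP,
    // meta-tag it and push it onto an SQS queue for the processors.
    var http = require('http');
    var AWS = require('aws-sdk');

    var sqs = new AWS.SQS({ region: 'eu-west-1' });
    // Illustrative queue URL, not a real one.
    var QUEUE_URL = 'https://sqs.eu-west-1.amazonaws.com/123456789012/events';

    http.createServer(function (req, res) {
      var body = '';
      req.on('data', function (chunk) { body += chunk; });
      req.on('end', function () {
        // Meta-tagging: wrap the raw payload with receive time and source.
        var event = { receivedAt: Date.now(), source: req.url, payload: body };
        sqs.sendMessage({
          QueueUrl: QUEUE_URL,
          MessageBody: JSON.stringify(event)
        }, function (err) {
          res.statusCode = err ? 500 : 202;
          res.end();
        });
      });
    }).listen(8080);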

The company I worked for used Puppet for configuration management, but also for app deployment, which I thought was the wrong approach. Puppet is, in my opinion, not designed for realtime deployment, but rather for booting and maintaining VMs from config. That is how I came across Saltstack's powerful realtime command capabilities, and I decided to script our deployment process to be controlled by Saltstack. I haven't kept up with Saltstack in a while, but it fit the bill for our needs, and I was bold enough to build it into our POC.

Too bad we hadn't learned about Google's Go language back then; otherwise I would have scratched my head and probably opted for it instead of Node.js for our highly concurrent applications.

OS X – use the force

Ever since I was young I knew that I would always be automating a lot of my computer work, in order to achieve things better and faster. My motto became: fix it when it breaks, or needs fixing, and do it ASAP. That way you benefit immediately, and it will save you time in the future.

After having tried Windows and Linux, I arrived at Mac OS X, which has more power tools than the other OSes, and the most user-friendly graphical user interface. And all that on top of a UNIX base, allowing me to hack away in the terminal.

I have just updated my OS X apps section, where I list most of the apps that are essential to my everyday computing. I thought it wise to share 'em with you all.

To give you even more power-user skills, I have also published the Google documents from my last OS X power course here:

Use the force, Luke!


It’s all coming together now

I was asked to speak to a small group of frontend web developers about my JavaScript expertise. Of course I said yes, and started thinking about it. I wanted to use my newly learned lessons from The Art of Hosting. I also told my host I wanted to take a personal approach and include my own stories. I said I wouldn't be needing a projector or flip chart. He was very interested and let me go my way.

When we went to the presentation room, I saw a round table, just big enough to host us all. There were nine of us, which created an intimate space.
I started with a check-in and asked each person to tell us who they were, what inspires them, and what they expected from the evening.
Once the first person started, everyone became engaged, and we listened with interest as each took their turn. Some talked about their professional selves, others took a more personal route. Wonderful!
I felt my body relax and my mind clear, and I was able to truly listen to the others and become familiar with their faces. Questions were asked out of genuine interest.

Finally I told my own story. I was able to be fully at ease and look everybody in the eye. That was a first for me, and I attribute it to the initial check-in round, and to the small group as well.
I talked about my personality, my lack of degrees, my initial insecurities because of that, and how I overcame them by reading whatever I could about my area of expertise. About how I got to know myself better, enabling me to become more solid and gain integrity. And that I sometimes need to manage my overenthusiasm.
The group responded multiple times with questions and recognition.

In essence I was telling authentic stories, exposing my weaknesses and how I gained strength by accepting and getting to know them. Their faces told me they were intrigued, sometimes amazed, but all of them were engaged. Some faces started showing minor agitation, which I think was impatience with my build-up, or a mismatch with their expectations.

So to move them towards the topic of the evening (JavaScript), I went on to speak about the moment I fell for JavaScript, and about who crossed my path to inspire me.
I talked about my open source project called backbone-everywhere, and how it was meant to be a demo for a startup. The project involves bleeding-edge open source JavaScript, which is common ground for most of us. So we ended with a discussion about the area of interest we had all hoped to talk about.

Afterwards I asked them for feedback: what they thought about the format and about my way of hosting.
They all preferred our participatory setup over a regular presentation, and felt energized.
Because I hardly got any critical feedback, I kept asking for it.
One person then told me he got a little frustrated at not knowing how and where the evening would go. I thanked him, and explained that I am learning to detect such signals in the heat of the moment, so I can ask what is needed.
Some people told me that more structure in the informative section would be nice. I agreed there. The lack of a visual presentation gave them a new kind of experience and engagement, but I realize that I should provide some visual structure for stories that involve lots of technical aspects.

When most of the people were gone, I remained with the two initiators of the evening. They were very enthusiastic about my approach, and we talked about setting up a new JavaScript course together.
What an energizing and fruitful night! It's wonderful to see everything come together, with aspects of life seeping into work, and vice versa.


Android hemroids

I am a genuine late adopter of modern technology. I do see a lot of it coming from miles away though, and try to get in touch with it before it arrives. Read up on it, let it sink into my stubborn brain cells.
And so I fell in love with Android a long time before it was found on any commercial device. As a programmer I immediately fell for its architecture, its intents, its openness.
While waiting for the baby to mature, I read up on the first user experiences and decided to wait one more year before trying it out myself. Being very fond of my iPhone user experience, and a Linux user already, I was very reluctant to try another open source OS.

So in 2010 I finally decided to buy the Goog's Nexus One and treat myself to a whole different mobile experience. No cruft, straight-up Android from the core.
Nice new features, integrated messaging and notifications, free navigation, supported rooting; a dream come true from a programmer's perspective!

But then the wet cement started coming up through the cracks, reminding me of the slow, shaky and ever-evolving Linux OSes. Why would it be different with Android? Marketing? A larger user base? Of course not. The foundation is the same. Openness breeds variety, and the lack of control allows buggy software. It just kills the user experience when there is no senior at the top of the pyramid overseeing a coherent interface to the OS, as well as its ever-growing list of apps.
Android's choice to let performance measures come from the community itself, rather than exert dominant control over such an important aspect, resulted in an unresponsive and sluggish device. Trying and uninstalling many task-killer and performance apps taught me to keep it in shape somewhat, but how cumbersome!
But most irritating to me was the fact that the Android market became the waste bucket of successful iPhone apps, with even the largest web services out there offering alpha software. Not only were most of them very buggy, but it seemed that the Android user base was treated like the Linux user base: expected to care not about user experience, but only about openness and features.

And that is what made me sell the device after two months of trying very hard to make it perform acceptably, and go back to my good ol' iPhone 3GS.
I decided to give Android more time to mature, and hoped the inevitable growth of its user base would demand a top-notch user experience. Boy, was I wrong.

Two years after my first encounter, I now own a quad-core Asus Eee Pad Transformer Prime monster tablet running the latest Ice Cream Sandwich, holding more power than necessary for a smooth user experience. Or so I thought.
ICS 4.0.3 still does not control performance, and apps run wild, interfering with my user experience.
The apps I use most, such as Facebook, Spotify and Twitter, are all cream of the crap, offering the same crappy UI and limited functionality as years before. But I can't really blame app developers for not wanting to support an OS that does not deliver the same functionality across the multitude of devices churned out every day. I guess I have to see that as a given from now on.

What was I thinking? I should have realized that the same lack of control over performance and apps, along with the increasing complexity of hardware support, truly hinders top-notch, front-of-the-line mobile device experiences.
Apple has been criticized by that same Android community for exerting exactly this kind of control, and app makers have been scorned for only supporting Apple's stable, unified hardware approach. But besides being an open source advocate, I am also a power user in need of a user experience that allows my quick and intuitive workflow. My day-to-day operations are not to be hindered by sluggish OSes and unusable apps.
There is no choice for me but to go back to Apple's stable, and ride their willing and able iron horse, taking me into the camps of the frontiersmen, letting me indulge in their nourishing stream of app cream.

Iktami Devaux writes amazing stuff

When I met my friend Iktami Devaux for the first time in El Bolsón, Argentina, I immediately fell in love with him (no, I am not gay, and he was already a beardy old man in his mid-fifties). With his openness, witty outlook, spirituality (yet firmly grounded in reason) and liveliness, he was talking directly to my inner self. With his body slowly degrading, like a failing machine grinding to a halt, he saw his sickness (a rare form of arthritis) as his master, pushing him to take his plans for life to another level. And plans he has. To keep his sanity he has to keep writing down his thoughts, in the form of reflections as well as full-blown books.

After having read the first collection of his short stories, "Through the wilderness of love", I also became a fan of his work. His writings were so much in line with the path I had set out for myself, being in a relationship with my fellow beings and nature, that I could not ignore them. But in the six years thereafter I was distracted by my own life and work, and we sometimes found ourselves sharing thoughts only once a year.

Until he translated "El arte de no hacer nada" into "The art of doing nothing", which I had the pleasure of reading in 2011. It immediately sparked the fire that lay dormant within me, and I asked for more. I also decided I would make him a website to sell his books; until then he had only been selling them at the fair in El Bolsón.

Well, that work is now finally done, and I am proud to present to you his website, which is fully bilingual:

iktami.com

Because I truly believe his writings should reach as wide an audience as possible, I asked him if I could distribute "The art of doing nothing" as a PDF for free. And he agreed! I quickly converted his rich text document to a PDF, so here's the link to it on Dropbox:

The art of doing nothing – free PDF version.

Please read it and tell him what you think (you can comment on his website). He loves to communicate and communicates love 🙂
(And if you really like it, you can make a donation on this website.)

Backbone everywhere

I finally put my newly built Node.js MVC stack on GitHub! You can download it here: backbone-everywhere.

What’s so special about it? Here’s my list of exciting features:

  • Pages are rendered on the Node.js server by Backbone and jQuery.
  • All script resources are bundled by browserify and fileify, uglified by UglifyJS, and gzipped by connect-gzip for fast loading and deployment in other possible JavaScript environments.
  • The entire Backbone MVC stack works on the server, and is loaded in JavaScript-enabled browsers to take over from there.
  • The app state is reflected in the URL by means of HTML5's pushState, or using hash notation when pushState is not supported.
  • The same app state is regained for non-JavaScript browsers pulling full pages from the server, so no worries about SEO!
  • All client / server communication is handled by socket.io (Ajax is sooo 2009), and subscribed clients are updated with published models (see the sketch after this list).
  • A JSON-RPC service listening to '/api' requests, with an easy-to-extend service layer. Handy for non-web contexts such as mobile devices.
  • All data is persisted in Redis through my own adaptation of backbone-redis, enabling indexing, sorting and foreign-key lookups.
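
To illustrate the pubsub bullet above, here is a minimal sketch of the idea. The channel and event names, and the publish hook, are illustrative; they are not the actual backbone-everywhere internals:

    // Pubsub sketch: clients subscribe to a channel, and every model
    // saved on the server is broadcast to all subscribers of that
    // channel. Names are illustrative, not the real API.
    var io = require('socket.io').listen(8080);

    io.sockets.on('connection', function (socket) {
      socket.on('subscribe', function (channel) {
        socket.join(channel);
      });
    });

    // Called from the persistence layer whenever a model is saved.
    function publish(channel, model) {
      io.sockets.in(channel).emit('model:update', model.toJSON());
    }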

For me this is a whole new approach to engineering web applications, but I think I've managed to get a grip on it.
Not only that, it gave me a great impulse to reconnect with the pioneers of tomorrow, because what I have done is build on top of the work of people with great vision.
Big shout out to the open source community, and to the people willing and wanting to share. The sum of its parts will eventually overcome the current patent-trolling paradigm.

What are you waiting for? Dig in!

Node.js is the future of web dev happening now

Having been a JavaScript fan since I started working with it a long time ago, I immediately fell in love with Node.js. After discarding Jaxer earlier as too proprietary, even though it offered a solid mechanism for code reuse on the client, I have now adopted the uber-active Node community.

You see, my partner and I decided to build a new and exciting community site/app (of which I have to keep the details secret for now). So when I started designing our new web 3.0 app with its accompanying mobile app, I thought about and thoroughly investigated the possible frameworks out there. We decided to build a single-page application that would work in all JavaScript clients, and would gracefully degrade to server roundtrips when JavaScript was not available (also to enable deep linking for SEO). I quickly decided to discard most PHP frameworks for our MVC setup, not only because they were mostly too bloated or complex, but also because they implied having to recode a lot of functionality for the client. Of course I also favour the stateful and event-driven possibilities of JavaScript, so I made the paradigm shift and chose the Node JavaScript stack.

With all the Node modules out there, it is finally possible to create a full MVC framework operating on the server as well as in the client, in the form of Backbone.js with the help of bones (a Node Backbone server implementation offering code reuse in the client).
And with the help of HTML5's new pushState, we don't have to worry about breaking Google's need for deep linking. (Those looking for an example with gracefully degrading URLs, take a look at the jQuery Address plugin.)
More goodies come in the form of the browserify module, enabling us to optimally pack all our resources for client-side usage, even all our templates and other static files!
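
To give a taste of the pushState fallback, this is roughly how Backbone lets you opt in while degrading to hash URLs. A sketch; the feature detection here is my own, not something bones provides:

    // Start Backbone routing with pushState where the browser
    // supports it, falling back to hash-based URLs everywhere else.
    var supportsPushState = !!(window.history && window.history.pushState);

    Backbone.history.start({
      pushState: supportsPushState,
      hashChange: true
    });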

Some more info for those interested:

With regard to storage, we decided to go with Cassandra, since we expect a lot of writes and a lot of scaling. In the meantime I am hoping somebody will come up with a nice abstraction on top of the new Cassandra CQL language, since there are already some Node modules out there working with it.

As for sessions, I am currently favouring Redis, which also has a nice pubsub layer. But I haven't investigated that path fully yet.
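
For the curious, the pubsub layer really is as simple as this sketch suggests. It assumes the node_redis module; the channel name and payload are made up:

    // Minimal Redis pubsub sketch with node_redis. A subscribing
    // connection can't issue other commands, hence two clients.
    var redis = require('redis');
    var sub = redis.createClient();
    var pub = redis.createClient();

    sub.subscribe('sessions');
    sub.on('message', function (channel, message) {
      console.log('received on %s: %s', channel, message);
    });

    pub.publish('sessions', JSON.stringify({ user: 42, action: 'login' }));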

In the meantime I am working on an iPhone demo in Appcelerator's Titanium. Too bad it doesn't support all the functionality we need on Android as well.

That’s all for now. I have to try and curb my enthusiasm, as all this goodness may come at the expense of my sanity due to sleep deprivation.

My love for PayPal just died a little.

After having put some days of work into creating my friend's book site (iktami.com), I decided to keep it simple and implement PayPal's "Add to cart" buttons.
Wrong choice.

You see, I was under the impression that PayPal's shopping cart would be able to handle shipping costs based on weight. But no, it can't. Well, it can, but only if you're a merchant residing in the US.

What? I thought computers and the internet would help bring an end to discrimination and suffering.

What could be the reason for this? I am sure it's not a technical problem, because that simple little piece of code is agnostic of location. And it isn't money, because they would probably make even more if they didn't piss off merchants outside the US.
Maybe they just don't care anymore, having all that money, and getting (re)tired of it all.
Or could it be some hidden political agenda? These things pop up everywhere these days.
I am probably missing something here.

Taking PHP development to the next level

A lot of my enthusiasm for programming comes from my never-ending interest in its evolution. I had been playing with computers since I was seven, and already knew at age thirteen that I wanted to become active one day in the 'automating' industry. I've always regarded programming as a trade, a tool to automate repetitive tasks, but also to offload (error-prone and inconsistent) people tasks to foolproof automated systems. I just wanted to make stuff easier and better, and found a way to do it.
I truly started my programming career in 1999 as a Java programmer in the corporate world, learning a lot about real-world software problems. It made me realize a lot of things.
But even in the year 2000, internet applications were not widely recognized as "the way to go". My employer, one of the best among Dutch knowledge engineering services, was not paying full attention to them yet.
So I left. I simply had to find out more about web technology, which I regarded as the future of software development.
I carefully chose to follow the 'open source' path (as opposed to the closed MS world), and divided that territory into "corporate" versus "fun and dynamic". I ended up in PHP land.

But soon I found out about the drawbacks of PHP development. It's a loosely typed language with a scattered user base, supporting all kinds of immature frameworks and libraries, for all the wrong reasons. Simply stated: PHP land was short on inhabitants with extensive knowledge of software development in general. After a while, though, PHP was recognized for its RAD abilities, and more and more people with skills came into the game. This led to yet another myriad of frameworks and libraries, but now it was all about documentation, participation and a growing user base. Because solutions were being reviewed, tested and explained, quality started to rise to the surface.

But how does one recognize quality in this field?
Since I have never studied software development at a university or the like, I will always regard myself as a noob on matters I have no experience with. I have learned everything from books and from material on the internet. But because I was always so insecure about my knowledge, and truly interested in what I was doing, I never stopped reading. That's how I discovered that it is most beneficial to start by learning the industry's best practices and 'patterns', before formulating your own suspicions and concepts regarding programming solutions. Building onwards from that premise, I feel I have finally grasped an overall intuition for what matters most, and which choices should prevail over others. Of course there is no holy grail of software design, but given a business goal, a context and limiting factors, one can surely sketch a fruitful path for future development.
Do I bore you already?
The point is that I think higher-level principles should be considered first when making choices in software development. Therefore I was always drawn to development platforms that obey such principles, even though some of their sub-solutions are plain awful.
Steep learning curves and incomplete documentation were arguments very low on my list when weighing a choice. As I have a brain, and a debugger, I'd rather work with beautiful, flexible software than some simplifying framework caving me in and making me jump through hoops to do the out-of-the-ordinary.
(Please don't respond with PHP framework 'showdown' arguments, as I think Sheldmandu has a nice take on that, even though I don't follow his simplistic 'inheritance is key' view.)

That's why, about four years ago, I chose to go with the Zend Framework, it being the most mature and offering so much of what I thought was important. It still forced me to do a lot of work by hand, and a lot of its code base seemed rather inconsistent, I think mainly because they let it grow too quickly without properly reviewing user-contributed additions to the code base. As a result, I think a lot of their solutions are simply not well thought out (like forms and data access).
But I loved their coding style, and the way they used naming conventions to get stuff rolling.

Then, about two years ago, I stumbled upon FLOW3, a new PHP framework built by the developers of TYPO3. In my opinion it will be the most advanced and productive PHP framework in the near future, but that remains to be seen, as it still needs to become widely adopted. It has implemented a lot of the best practices from the Java scene (annotations, AOP, dependency injection), which I think is a good thing, as that scene has been dealing with the largest chunk of complex software development for a long while.
I concluded that it needed a lot more time to come of age.

Around that same time I started using Magento for my own webstore (the project was killed over a refuted business deal), and after that I worked freelance with its complicated yet powerful CMS abilities for two years. A lot of third-party extensions built by novice Magento developers were indirectly compromising many of the projects I worked on (in the form of performance-degrading or bug-introducing plugins), thanks to Magento being too flexible and not understood well enough.
But Magento made me realise something:
What's the use of a framework or library if it doesn't offer concrete, reusable solutions to real-world problems? In other words, why could I not find a development platform crafted on top of these high-quality frameworks and libraries I so admired?
Magento at least offered me out-of-the-box configurable layouts, views, themes, multi-site functionality... bla bla, etc. etc.

Now that I wish to work on more versatile projects, preferably using out-of-the-box, proven solutions, I have to check my facts again. I don't want to go back and reinvent so many wheels. What are my options? What CMS can I use so I don't have to repeat myself too much for every similar project?
Let me write down some of my requirements:

  • An intelligent caching layer, like Magento's, with its hole-punching capabilities.
  • An annotation-based domain persistence layer, using POPOs (plain old PHP objects), decoupling data storage and making persistence trivial. (Doctrine 2 might fit these needs.)
  • An easy-to-set-up service layer or domain API, enabling easier development of client-server architectures. JavaScript, apps and Flash are my main clients these days.
  • Straightforward creation of sites, templates and content, facilitating a multidisciplinary approach and separation of concerns.
  • Inheritance-based theming, allowing a stack of theme overrides, like Magento's upcoming theming system.
  • A plugin-based architecture, preferably with a controlling mechanism warning developers about slow and impeding functionality.
  • Many great working plugins fitting my business needs!
  • Possibly some elaborate publishing mechanism, allowing the entire application (files AND data, please) to be pushed onwards in its development cycle: DEV / TEST / USER ACCEPTANCE / PRODUCTION.
  • A large user base and developer involvement ensuring stable releases.

(Please feel free to add to this list)

Lately I found that FLOW3 is approaching its first stable release (possibly September 2011). Not only that, the release of TYPO3 v5 will be based upon FLOW3. So I rather hope TYPO3 will turn out to be the CMS I am looking for.