It was the 2013 Microsoft MVP Global Summit the week before my birthday, and I had finally been able to make arrangements to attend. It was a great week of learning what comes next for Visual Studio and C#, and I also got to meet guys I've followed, read and talked to online: Scott Hanselman, Phil Haack, Scott Koon, and Laurent Bugnion to name a few.
But what completed the whole experience for me was meeting the guy who made C# the awesome language that it is today. I'm talking about Anders Hejlsberg.
Anders Hejlsberg and me at the 2013 Microsoft MVP Summit
A totally geek moment and an apt birthday gift to a C# junkie like me.
Back in September 2012 I got an email from a friend introducing me to one of the managers at his company.
This friend of mine had been telling me for years about how great his company is and how much he enjoyed being there, and had invited me to join them then. However, it required a move overseas and, for personal and family reasons, that wasn't an opportunity that I took seriously.
The surprise then was finding out that the manager he had introduced me to had been tasked to set up an office here in Manila and build a development team that adheres to the stringent standards of their company. Of course there was an implicit invitation to join their team as well -- but as I had a bevy of commitments back then, I opted to just help them any way I could in setting up the new office.
Fast forward ten months: I will be joining Readify as a Senior Developer after a three-year hiatus from the Microsoft stack.
I had a great time with Cormant (as a returning consultant) and ITRS doing iOS, Java, and a bunch of other things completely outside of my prior experience, and the lessons I learned delving into them are invaluable to me.
Assuring quality is everyone's job -- but programmers and testers alike forget that.
I have encountered all sorts of variations in where developer testing ends and Quality Assurance/Professional Tester testing begins. Some developers hack away carelessly at their code and throw it over to QA (often the first test the code ever encounters), then just wait for the bug reports to come in. At other times, QA personnel file bugs that aren't defined in the specification and start treating programmers as "the cretins who introduce bugs". Entire wars erupt between these two sides of development.
Hiring professionals in the Dice survey placed Quality Assurance (QA) on the "low priority" side of the ledger. Do not expect this to change. These days, the tech industry seems to be following Google's lead and turning everyone into beta testers. Users are the ultimate quality assurance staff - and they don't get paid! [Brian Hall - 10 Technology Skills No Longer In Demand]
While the absence of testers dedicated to making developers cry seems like a good idea to software developers, the lack of staff dedicated to testing presents its own problems. There has to be a set of "fresh eyes" that will not see an application under the narrow focus developers tend to have. Implementing the "users as QA" principle also requires the very expensive overhead of analytics modules to track the user actions that caused a particular bug. Not to mention that the analytics code would probably have bugs of its own.
While I tend to dislike the traditional definition and role of Quality Assurance, I believe there are three areas QA professionals can focus on to keep themselves relevant in the industry: test automation, user experience, and domain expertise.
Test Automation - There is enough tedium in the QA process to be worth automating. Repetitive tasks should be relegated to scripts, freeing QA to focus on writing test cases and measuring application performance instead of random bug hunting. Besides, even if you had access to thousands of cheap testers, repetitive tasks are best done by the very machines the applications are built to run on. There are countless tools dedicated to this effort, and many of them let you address Michael Hunter's You Are Not Done Yet Checklist [PDF] in an automated manner.
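As a minimal sketch of what "relegating repetitive tasks to a script" can look like -- the function under test and the cases here are hypothetical stand-ins for whatever a tester would otherwise exercise by hand:

```python
# A table-driven smoke check: the kind of repetitive verification that is
# better run by a machine on every build than by a tester on every release.
# normalize_username is a hypothetical stand-in for real application code.

def normalize_username(raw):
    """Trim whitespace and lowercase a username; reject blank input."""
    cleaned = raw.strip().lower()
    if not cleaned:
        raise ValueError("username must not be empty")
    return cleaned

# Dozens (or thousands) of input/expected pairs live in data, not in a
# tester's memory -- adding a new case is a one-line change.
CASES = [
    ("  Alice ", "alice"),
    ("BOB", "bob"),
    ("carol\t", "carol"),
]

def run_smoke_checks():
    """Return a list of (input, expected, actual) tuples for failed cases."""
    failures = []
    for raw, expected in CASES:
        actual = normalize_username(raw)
        if actual != expected:
            failures.append((raw, expected, actual))
    return failures
```

An empty list from `run_smoke_checks()` means the build passes this little suite; in practice the same idea scales up through unit test runners and UI automation frameworks.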
User Experience - If there's one area that cannot be efficiently tested via automation, it is the visual appeal and intuitiveness of an application. Quality assurance professionals who focus on usability, user experience, and design -- not just in terms of visuals but also in terms of workflow and ease of use -- will be very valuable in producing software that has either a lower after-sales support overhead (for enterprise software) or mass appeal (for commercial apps).
Domain Expertise - Many times the breaking point of a software project is the software's ability to follow specific business rules or accommodate new regulations it must comply with. Quality assurance professionals who gain domain expertise, and with it greater insight into an application's impact on the business side of the equation, would be immensely valuable -- even to outside organizations working within that same domain.
Just as software development has evolved far from being a solitary effort of running punch cards and checking whether the desired output was achieved, quality assurance has to evolve beyond merely using an application and reporting any bugs found. A QA professional possessing at least one of the above skills beyond the normal tester job description would be invaluable to any organization that prides itself on shipping quality software.
With today's web and mobile technologies sporting user interfaces of all shapes, sizes, and colors, it is easy to forget that once upon a time, when someone said "computer", they meant facing a drab, single-color, single-font-size terminal that tended to hurt your eyes.
This dull vision of technology was changed by a young Steve Jobs, who on January 24, 1984 came out with a computer that could display much, much more than text. The Macintosh was the first in a long line of products that to this day bear (almost) the same name.
It is probably reasonable to say that the term Graphical User Interface was, for the general consumer and personal computers, born that day.
Prior to the Macintosh, user interfaces were largely textual, and one of the few other products on the market with a GUI was the Xerox Star, a client-server system that required an office to shell out $50,000 to $100,000 and was definitely out of reach of the common user. There was also the Xerox Alto, the basis for the Star and the famous prototype at Xerox PARC where Jobs got his ideas in the first place -- but that one never made it to market.
However, the Macintosh was more than just a fancy computer with pictures. Steve Jobs was also very particular about "look and feel": how the menus and buttons of an app worked together, how layouts affected use, and how aesthetically appealing the Macintosh looked. While the term "User Experience" may have been coined only in the mid-90s, the concept of an easy-to-understand and easy-to-use application -- not just in terms of visuals but in how it shaped working with the software -- clearly started well before that, and was perhaps championed by Jobs with the Macintosh.
I'd like to wonder out loud: was user experience invented with the Macintosh? Did Steve Jobs invent user experience? Or have I been RDF'ed into thinking that way?
What do I mean? I have nothing on me yet. No specifications. No user stories. No real-world examples. Every signal swimming through my synapses related to my current project is hypothetical.
That's a problem -- at least for me.
Many of the languages, frameworks and concepts I've had to study before were either existing as a legacy system or at least had some sort of specification lying around. Before that, everything was just fun, games, and Hello World.
I learned BASIC off a GW-BASIC manual as a kid trying to draw graphics on my monochrome screen. I learned C and Visual Basic for machine problems and projects at school. I learned HTML so I could make a website for my freshman class. I learned COBOL, C#, and iOS building on existing projects at work.
It seems the brain invokes the YAGNI (You Ain't Gonna Need It) rule -- the one that teaches you not to implement application features you don't need yet, or only think you're going to need. I'm going to use TDD on my new project, that's for sure. But because it knows any code I write now is throwaway code, my brain has a hard time cooperating.
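For what it's worth, even a throwaway TDD warm-up follows the same red-green rhythm. A minimal sketch, where the kata and the function name are entirely hypothetical:

```python
# Test-first on a throwaway kata: the assertions below were (notionally)
# written before leading_digit existed, and the implementation does only
# what the tests demand -- YAGNI applied to the code itself.

def leading_digit(n):
    """Return the first decimal digit of a positive integer."""
    while n >= 10:
        n //= 10
    return n

def test_leading_digit():
    assert leading_digit(7) == 7
    assert leading_digit(42) == 4
    assert leading_digit(1999) == 1

test_leading_digit()
```

None of this code will survive into the actual project -- which is exactly the brain's objection.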
It explains all those times I had a hard time studying in school -- I didn't think I was going to need any of those lectures in real life (I was mostly right). Too bad it took me a decade to realize it.
I never thought YAGNI was applicable to study and learning -- but here it is.
At the beginning of the book I thought Scott Rosenberg was going to fail in his narrative of a software development project. As a software developer, reading through the drudgery and pain feels akin to a small boat dragging its bottom across a shallow reef. I had terrible feelings of depression all through the halfway point as I relived similar mistakes and experiences -- some my own doing, some my colleagues', some management's -- on development projects I have been in.
Only when the book reached its final third -- somewhere around Chapter 9 -- did I start understanding why Scott had to put everyone through all that pain. It's particularly difficult to explain to non-developers why programming isn't as easy as they think it is, and Rosenberg HAD to drag his readers through that pain to get the point across. It was in this final part of the book that I found joy and wisdom: not only am I not alone in my struggles as a software developer (most of the book makes this point), but even the greatest minds in the industry have wracked their brains over this inherently difficult problem.
Particularly striking to me were the explanations of the work of Donald Knuth, whom I had only heard about before this book. Here I came to appreciate his quest to create something practical (his decade-long work on TeX) and feed it into his more esoteric pursuits (his lifelong work on "The Art of Computer Programming").
All in all, I feel this book is a must-read for everyone working in the software industry, specifically because it compiles the greatest resources on the subject matter -- from Frederick Brooks of Mythical Man-Month fame, to Watts Humphrey's CMMI movement, to the formation of the Agile Manifesto and Extreme Programming, to 37signals' lean approaches -- and Rosenberg successfully wire-walks these contradicting views with sufficient balance and without bias toward any of them.
I celebrated my 32nd round trip in the solar system last weekend.
In the morning we fed some kids from where the poorest of the poor of Manila live.
In the afternoon I helped with the logistics of an arts and music workshop for kids studying in a working-class neighborhood school that my uncle helps run.
I feel it's important to join these kinds of activities once in a while. For all the problems we think we have, when you put yourself in the shoes of these children, their woes are more daunting and their world much bleaker than ours.
Then you look at them and realize that they still, somehow, manage to smile. Perhaps you can learn to smile at your problems too.
Some things should embrace the wave of massively online technology, like education:
It’s been interesting watching this unfold in music, books, newspapers, TV, but nothing has ever been as interesting to me as watching it happen in my own backyard. Higher education is now being disrupted; our MP3 is the massive open online course (or MOOC), and our Napster is Udacity, the education startup.
We have several advantages over the recording industry, of course. We are decentralized and mostly non-profit. We employ lots of smart people. We have previous examples to learn from, and our core competence is learning from the past. And armed with these advantages, we’re probably going to screw this up as badly as the music people did. [Clay Shirky - Napster, Udacity and the Academy]
I surmise it will take a long time before this actualizes, but the writing is pretty much on the wall. While traditional companies will still value diplomas and transcripts of records, the prevailing startup industry will start valuing skill more than degrees.
Some things are better left offline though, like say, military logistics pipelines crucial for fighter squadrons:
As the Marine Corps prepares to set up its first operational squadron of F-35s next week, some experts say other security risks may lurk within such a large and highly networked weapons support system.
One concern: Lockheed shored up political backing for the F-35 by choosing suppliers in nearly every U.S. state. But having such a large and widely dispersed group increases exposure to cyber attacks, said Ben Freeman, national security investigator with the non-profit Project on Government Oversight. [Andrea Shalal-Esa - Insight: Lockheed's F-35 logistics system revolutionary but risky]
This fiasco reminded a friend of Battlestar Galactica, where the protagonists were able to fight back with older fighter craft precisely because those weren't "networked", and so weren't affected by the computer virus the enemy Cylons used to disable the Colonials' systems.
There are, however, some things that are just plain badly implemented -- like the Romney campaign's ill-fated voter monitoring system:
Called "Orca," the effort was supposed to give the Romney campaign its own analytics on what was happening at polling places and to help the campaign direct get-out-the-vote efforts in the key battleground states of Ohio, Florida, Pennsylvania, Iowa, and Colorado.
Instead, volunteers couldn't get the system to work from the field in many states—in some cases because they had been given the wrong login information. The system crashed repeatedly. At one point, the network connection to the Romney campaign's headquarters went down because Internet provider Comcast reportedly thought the traffic was caused by a denial of service attack.
"The end result," Ekdahl wrote, "was that 30,000+ of the most active and fired-up volunteers were wandering around confused and frustrated when they could have been doing anything else to help. The bitter irony of this entire endeavor was that a supposedly small government candidate gutted the local structure of [get out the vote] efforts in favor of a centralized, faceless organization in a far off place (in this case, their Boston headquarters)."[Sean Gallagher - Inside Team Romney's whale of an IT meltdown]
I'm amused whenever someone complains that there is a shortage of "common sense" in the world.
Common sense in its essence is formed out of the personal learnings, experiences, lessons, socio-political and religious inclinations, and information that an individual possesses, which almost certainly means that no two persons have the same "common sense". It follows that common sense isn't common; in fact, differences in common sense account for much of the disagreement, argument, debate, trolling, and hubris on the internet. But it also means there are great opportunities to learn something "out of the box".
I'll talk about three things that boggled my common sense this week.
Open source legislation
Clay Shirky gave a great talk at TED Edinburgh on how people are starting to use GitHub, the distributed source control site normally used by software developers on open source software, to introduce a kind of "open legislation" and bring transparency in democratic lawmaking to a whole new level.
While "common sense" dictates that bringing together so many different people, with so many different opinions, to build complex structures like software and legislation would be chaotic, inefficient, and downright impossible, this video illustrates how programmers have transcended those barriers. It just might make sense to give open source lawmaking and legislation a chance.
I shared the details of Steve Jobs's story because when it comes to finding fulfilling work, the details matter. If a young Steve Jobs had taken his own advice and decided to only pursue work he loved, we would probably find him today as one of the Los Altos Zen Center's most popular teachers. But he didn't follow this simple advice. Apple Computer was decidedly not born out of passion, but instead was the result of a lucky break--a "small-time" scheme that unexpectedly took off. [Read more..]
So Steve Jobs had a bit of luck, sure, but that isn't the complete story. Jobs did become passionate about his work at Apple afterward, and went on to drive it to become the most valuable company in the world. But if following your passion isn't the right thing to do, what is? Seth Godin gives this golden tidbit in his book Linchpin:
Maybe you can't make money doing what you love (at least what you love right now). But I bet you can figure out how to love what you do to make money (if you choose wisely).
The best place to spur innovation
In the era of Web startups and billionaire technologists, it's easy to think that the best and most innovative products we've ever seen came from people who quit their jobs, started things from scratch, and grew them into something big. Who can blame anyone for thinking that way when Microsoft, Apple, Google, Facebook, and Twitter all share that same origin story?
Heart disease is prevalent in India but diagnosis is not, so Medtronic created diagnostic camps to identify potential patients. I saw one camp in a rural village where technicians used low-cost electrocardiogram machines to screen dozens of people in an afternoon and wirelessly send their ECGs to be read by doctors hundreds of miles away. Insurance is still rare in India, so Medtronic had to make its pacemaker more affordable. It worked with a local partner to create India's first financing plan for medical devices.
No new technology was involved here — and that's the point. Medtronic used business model innovation to enter markets formerly out of its reach. [Read more...]
So there are exceptions to common sense -- now what? The important thing is to learn to listen. Ideas that challenge our common sense, those that strike at the heart of long-held notions, are important and should be listened to if we want a better understanding of how the technology world works, how things change, and how things can be made better.