Saturday, July 26, 2014

An introduction to Git - the history and getting started

At the July Meetup the group met to discuss the merits and use of Distributed Version Control Systems and, more specifically, Git and GitHub.

For software developers who haven't used source-code management (SCM) before, it's useful to have some context on how it came about.



The story starts a few decades ago in the 1970s.  In 1973 Ken Thompson and Dennis Ritchie presented a paper on the subject of the UNIX operating system.  The concept was of great interest to the University of California, Berkeley.  In 1974 the mathematics and computer science departments pooled their resources and purchased a PDP-11 to run UNIX.  Their distribution of UNIX arrived on tape.

Ken wasn't involved in the installation of UNIX on the Berkeley PDP-11, and the team were having problems with the system: it kept crashing unexpectedly.  Ken was based at Bell Labs in New Jersey, so to facilitate debugging the team used a pair of modems to allow him to dial in remotely.  At the time Berkeley only had a 300-baud acoustic-coupled modem without auto-answer capability, so with some transcontinental coordination the team managed to get Ken online to assist with the debugging.

In 1975 Bill Joy joined the Berkeley team as a graduate student and became involved in the development of their UNIX system.  In 1977 Bill put together the first bundle of software based on the UNIX operating system core, called the Berkeley Software Distribution (BSD).  As the distribution secretary, Bill sent out 30 copies of BSD on tape in the first year.

Fast-forward to 1991, when Linus Torvalds was developing Linux.  Key to Linux development was the idea of community-based development, a foundation of the open-source movement.  The methods of mailing tapes and coordinating modem calls from the 1970s would never scale to support Linux development.

At first Linux development was managed by distributing archived files and patches, but the process was cumbersome and error-prone.  Linus was fond of the BitKeeper system developed by Larry McVoy's company.  BitKeeper was built on similar design principles to an existing product called Sun WorkShop TeamWare, interestingly developed at Sun Microsystems, where Bill Joy went to work after Berkeley.  Larry agreed to grant a free-of-charge license to use BitKeeper to support Linux development; however, the Linux community were at odds over Linus' decision.  Members weren't happy that Linux, an open-source project, was now relying on a closed-source system.

The community's fears were not without merit: in 2005 Larry withdrew the free-of-charge license.  At this point Linus made the decision to develop his own DVCS, and Git was born.  Git was a command-line tool that developers could use for source-code management; it was distributed by design to support the community, and, most importantly to Linus, it was fast.

A few years later, in 2007, Tom Preston-Werner, Chris Wanstrath and PJ Hyett started working on what was to become GitHub, an online hosting and collaboration service built around Git; the system launched in 2008.  Since then GitHub has become increasingly popular, and in a survey carried out by Eclipse in 2014 Git was used by 33.3% of developers, making it the dominant DVCS in the marketplace.
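
To make good on the "getting started" half of this post's title, here is a minimal sketch of the basic Git workflow. The project name, file, and repository URL below are placeholders for illustration, not a real project.

    # Create a new local repository and make a first commit.
    git init my-project
    cd my-project
    echo "Hello, Git" > README.md
    git add README.md
    git commit -m "Initial commit"

    # Inspect the history.
    git log --oneline

    # Publish to a hosted remote such as GitHub (placeholder URL).
    git remote add origin https://github.com/your-username/my-project.git
    git push -u origin master

Collaborators can then clone the repository with git clone and exchange changes with git pull and git push, which is exactly the distributed model that replaced mailing tapes and patches.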

P3P: The Precise 3 Minute Presentation

Originally published on www.themethodology.net
In discussing how to encourage the members of the VancouverMobile.net community to share with the group, we began to consider what the format should look like. Professionally, people are often confronted with topics that they don't feel a passion for, making it difficult to impart a compelling story and leaving both the presenter and the audience feeling indifferent to what was shared.  At the other end of the spectrum, we are often so excited by a story that we don't impart the details coherently, missing an opportunity to share something interesting with our audience.

Many meet-ups use popular formats: elevator pitches, lightning pitches, or the longer TED Talk format.  Each of these formats has its strengths and weaknesses, but none of them really seemed to fit. It was time to come up with our own format, and so the P3P (pronounced "pep"), the Precise 3 minute (and 33 second) Presentation, was born.

The idea of a P3P talk is very simple: encourage members of the group to stand up and impart a short story.  The length of time is intentionally short, three minutes and thirty-three seconds, to make presenting less intimidating and to encourage the story-teller to focus on a few salient points.  Key to the P3P talk is giving the audience enough information to get them interested, enough to make them want to follow up with the presenter.

The P3P format is easy to follow: within three minutes and thirty-three seconds, the presenter will impart a few pieces of information. The following are guidelines on what we believe the presenter would want to cover:

1. What's the need?  For the first part of the P3P, the presenter will provide the background on the need that they identified.
Most fairy-tales will identify a need as something like "rescue the princess from the dragon".  In technology it might be a gap in the market, such as "the ability to send messages cross-platform using data services rather than SMS" (WhatsApp).  This is the opportunity for the story-teller to grab the audience's attention and set the context for their P3P.

2. What's the solution? This is where the presenter has the opportunity to tell the audience how they met the need.
Using the fairy-tale metaphor: "I used a silver-tipped arrow to kill the dragon and rescue the princess."  This part of the P3P is optional; sometimes the story-teller might not know what the solution is.  They may understand the need but haven't quite figured out how to address it yet, so they may be asking the audience for ideas and input on how to go about meeting their identified need.

3. How was it solved?  Sometimes the solution isn't as interesting as the journey, and this is the opportunity for the story-teller to describe how they managed to go from the need to the solution.
Using a different metaphor, The Lord of the Rings wouldn't have been nearly as interesting if the journey had been left out.  This section, like the solution, is also optional.  It's possible that the story-teller has an identified need and a solution but doesn't know how to get there, so it's a chance to pose the question to the participants and canvass ideas for getting from need to solution.

That's it. We are currently trying to decide what happens after the P3P: do we give an opportunity for questions?  It would seem convenient to offer a P3P presenter a five-minute slot, so they can cover their P3P with time for one or two short questions.  Regarding timekeeping, we haven't yet kept strict time for those presenting at the meet-up; we've had the luxury of letting people run over, but moving forward the intention is to tighten things up.  We're also looking at developing some tools to help P3P presenters prepare; more on that to follow.
written by Nathan Roarty @njr_itarchitect

Monday, July 21, 2014

Why might 2014 be the year of Wearables?

There has been a lot of tech chatter about how 2014 is meant to be the year of the wearable.  At the May VancouverMobile.net meet-up the group came together to discuss trends and share their experiences of wearable technology.

Wearables are a subset of a much larger trend within industry to connect just about anything to the network.  This wider connected-device landscape is what the large IT marketing machines are calling the Internet of Things (IoT).  By devices we are referring to a whole host of things that previously did not have a network presence; it started in earnest with smartphones but increasingly involves just about any electronic device with a processor and, often, a sensor attached, from devices in the home, such as refrigerators, televisions, heating systems, and even light bulbs, to industrial SCADA devices such as meters and manufacturing sensors.

Recently, at Cisco's 2014 annual technology conference, CEO John Chambers opined that globally less than 1% of the devices that could be connected to the network are connected today.  Curious statistics aside, it goes without saying that there is an enormous market out there for devices and connectivity, waiting to be tapped by whoever gets there first.

Wearables, as a concept, aren't something new; the idea has been around for quite some time.  Think of Bluetooth headsets: although not really active on the internet, they did align with wearable principles, using network technology to make a service, in this case telephone audio, more "wearable".  It is, however, curious to look at where the Bluetooth headset is today: they are certainly not perceived as desirable devices and are now most often seen being used by business professionals and taxi drivers.

It does appear that recently there have been a number of key factors driving a reinvigorated interest in wearables:
  • Network ubiquity - not just WiFi, but cellular and Bluetooth too.  The reach of connectivity is ever expanding, and alongside coverage, bandwidth is increasing while the price of that connectivity is decreasing.  It is worth noting that the cost of cellular data is a point of much contention, but the increasing ease of accessing bandwidth through free WiFi can be argued to lower the overall cost.
  • More processing power - nobody really argues with the validity of Moore's law, and when you couple it with the fact that processing power is being offered in smaller, more power-efficient, and cheaper processors, there is a net increase in the amount of processing power in just about everything.
  • Miniaturization - the idea that things are getting smaller is often overlooked, but more is being squeezed into smaller packages.  Modern processors often include encryption on-board, commodity chips often include networking capability, and fewer pieces of silicon are needed to do more things; it's now easier than it has ever been to make feature-rich little things.
  • Commoditization - to top it off, all of these advancements are now more accessible and cheaper to buy.  It is no longer a question of whether or not you can source a component; today the questions are how many do you want, and how much do you want to pay?

There is an interesting trend that is perhaps most apparent within the wearables space: the ecosystem required to deliver an end-to-end service to a consumer.  Looking at the layers of technology involved in a wearable service, there is a requirement for hardware, software, and sometimes back-end analysis, and it is rare to see a single entity providing the end-to-end solution.  The hardware platform may require expensive research, development, and prototyping if an existing platform isn't already in the market.  Those hardware platform providers are increasingly looking to the software developer community for software innovation, exposing a plethora of APIs for the community to write code against.  For applications that depend on back-end services, a few established cloud providers offer easy and affordable storage, processing, and analytics capability that previously had to be in-house and expensive.  The model is still evolving, with a number of parties investing in the success of an end-to-end wearables platform.
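
As a toy illustration of how those layers meet, the sketch below shows a wearable (or its companion app) pushing a single sensor reading to a cloud back-end over HTTP. The endpoint, device ID, and payload fields are invented placeholders, not any particular vendor's API.

    # Hypothetical device-to-cloud call; the URL and JSON fields are placeholders.
    curl -X POST https://api.example.com/v1/readings \
         -H "Content-Type: application/json" \
         -d '{"device_id": "wearable-001", "heart_rate": 72, "timestamp": "2014-05-20T19:30:00Z"}'

In practice the hardware vendor's SDK would wrap a call like this and the cloud provider's analytics would aggregate the readings, but the division of labour between device, API, and back-end is the same.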

Whether or not the wearables market turns out to be the untapped gold mine that some opinion would lead you to believe, there are increasing numbers of devices joining the network; how many of those devices end up being wearables remains to be seen.