Thursday, September 4, 2014

How to add an iOS Image Set to an Asset Catalog using Xamarin Studio

 

With the official launch of iOS 8 just around the corner, I thought it was time to get up to speed on Asset Catalogs and, with that, Image Sets.

Image Sets let you group the different versions of an image together and then add these sets to an Asset Catalog.  You provide multiple versions of the same image and iOS automatically loads the correct version at runtime depending on the device.

Traditionally this would be used to provide regular and Retina assets, as well as different versions depending on the idiom (i.e. iPhone / iPad).  Apple made it clear at WWDC 2014 that this becomes much more powerful when using universal storyboards and storing different versions of an image based on traits (but that’s a whole other conversation).

So let’s see how we go about creating an Image Set in Xamarin Studio.  The first thing we need to do is create an Asset Catalog:

Right click on your project and select “Add” –> “New File” and then select “Asset Catalog”.

[Screenshot: New Asset Catalog]

Give it a name and click the “New” button.

By default the asset catalog includes Image Sets for application icons and launch images.  And this is a great way to keep them organized, but if you’re adding your application assets manually you can delete these two sets.

Next, right click on the Asset Catalog and select “Add” –> “New Image Set”

[Screenshot: Add Image Set]

Now just click on the content.json file and drag files from Finder to the appropriate slot.  Currently you can set universal assets as well as add device-specific assets (iPhone / iPad), and I’d expect to see further support for traits when iOS 8 is officially released.

[Screenshot: Adding assets to the Image Set]
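Once the image set is in place, loading it from code is a one-liner.  Here is a minimal Xamarin.iOS sketch, assuming an image set named "AppLogo" (the set name and view controller are illustrative, not from the steps above):

    using UIKit;   // Xamarin.iOS unified API (MonoTouch.UIKit on the classic API)

    public class LogoViewController : UIViewController
    {
        public override void ViewDidLoad()
        {
            base.ViewDidLoad();

            // Load the "AppLogo" image set from the asset catalog.
            // iOS picks the correct variant (1x/2x, iPhone/iPad) at runtime.
            UIImage logo = UIImage.FromBundle("AppLogo");

            var imageView = new UIImageView(logo)
            {
                Frame = View.Bounds,
                ContentMode = UIViewContentMode.ScaleAspectFit
            };
            View.AddSubview(imageView);
        }
    }

Note that FromBundle returns null if the named set can't be found, so a quick null check is worthwhile when the set name is built dynamically.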

Tuesday, September 2, 2014

A Security Perspective on the Cross-platform Mobile Space

The August Vancouvermobile.net meetup focused on mobile application security and covered some basic tips on how developers might go about securing data within their applications, looking specifically at a couple of encryption methods to keep data from prying eyes.

In preparing for the meetup I did some research on the security of the mobile platform to understand the scope of the problem and where the vulnerabilities lie.  There are a number of reports published by well-respected technology companies that review this area; a handful of these, including reports from Verizon and Cisco, were reviewed in preparation for this post.

The statistics and positions referred to in this post borrow from these reports.

The discussion starts from a fundamental premise: data has value.  Data can be viewed as an online currency, with different types of data having different degrees of value.  Not surprisingly, the greater the value of the data, the greater the risk of malicious actors trying to attack it.  To generalize the value of different data types, here is an interpretation of three data-value classifications:

  1. Personal Data – data that identifies an individual; however, having this data does not necessarily enable a malicious actor to do any harm.  Examples of personal data include e-mail addresses, telephone numbers or even a home address.  Things get interesting when you combine several pieces of personal data: the correlation of this data can lead to information that is more sensitive.  Personal data may be subject to regulatory control depending on the geography you are in.  For example, here in Canada there is the Personal Information Protection and Electronic Documents Act (PIPEDA), which establishes rules for the management of personal information by organizations involved in commercial activities.
  2. Sensitive Data – data that an individual would want to keep private.  Examples include credit card numbers, social insurance numbers, passwords and PINs.  This type of data is likely to be subject to local legislation such as PIPEDA, but may also have additional regulation in place; for example, PCI DSS compliance is required when handling payment card data.
  3. Business Data – this post is really about personal and sensitive information, but it is important to remember that a device may be used for business purposes, and business data has a different type of value.  The presence of business data may increase the risk of a mobile device being targeted.

This establishes a foundation: data on a mobile device has value, not only to the user who will want it protected but also to malicious actors who may gain from stealing it.  Malicious actors who want to obtain personal and sensitive data for financial gain often use tools that fall under the generic heading of crimeware, a subset of malware used to facilitate cybercrime such as identity theft.

One of the most popular crimeware platforms highlighted by the Verizon report is Zitmo (Zeus in the Mobile).  Zitmo is a variant of the Zeus malware targeted specifically at mobile platforms.

There are a few themes running through all of the reports that highlight why software developers need to secure their apps to protect data.
  • The mobile platform is seeing increased interest from malicious actors as a platform worth attacking.  Although the amount of malware targeting mobile devices specifically is still small, the trend shows it is growing rapidly.
  • Android currently appears to be the largest target for malware; Cisco reports that Android users account for 71% of encounters with web-delivered malware, with iOS in second place at 14%.
  • Physical theft of mobile devices still presents a real threat.  The easiest way to reduce the risk of data theft from a stolen device is to encrypt the data on the device.
  • To be most effective, security should be considered from the start of development, not as an afterthought layered into a product after completion.
  • Mobile devices make up only one aspect of a service; security needs to be considered on the device, across the network and in the cloud service that may be supporting the mobile application.

What can mobile app developers do?  A few things came out of the meet-up; this is by no means a comprehensive list, but a summary of the topics discussed in the time available on the night:
  • Consider the types of data being used within your mobile app:
    • What is the value of that data to the user?
    • How should the data be appropriately secured?
  • Use the right method to secure data depending on the data type and its sensitivity (there is a short sketch after this list):
    • Use encryption where data needs to be stored and retrieved later, and consider the right type of encryption for the data being stored.
    • Use hashing for data such as passwords and PINs where there is no requirement to recover the original value in the future.
  • Consider the use of shared services, such as OAuth providers, so that sensitive credentials need not be stored within your app.
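To make the encryption/hashing distinction concrete, here is a minimal C# sketch using standard .NET crypto APIs available in Xamarin projects.  It is illustrative rather than production-ready: the class and method names are my own, and protecting the key itself (e.g. in the platform keychain/keystore) is out of scope.

    using System;
    using System.Security.Cryptography;

    static class DataProtectionSketch
    {
        // Hashing: for passwords/PINs, store a salted hash (PBKDF2) rather than
        // the value itself; there is no way (or need) to get the original back.
        public static byte[] HashPassword(string password, out byte[] salt)
        {
            salt = new byte[16];
            using (var rng = RandomNumberGenerator.Create())
                rng.GetBytes(salt);

            using (var pbkdf2 = new Rfc2898DeriveBytes(password, salt, 10000))
                return pbkdf2.GetBytes(32);   // 256-bit derived hash
        }

        // Encryption: for data that must be read back later, use a symmetric
        // cipher such as AES; the key must be protected separately.
        public static byte[] Encrypt(byte[] plaintext, byte[] key, out byte[] iv)
        {
            using (var aes = Aes.Create())
            {
                aes.Key = key;
                aes.GenerateIV();
                iv = aes.IV;
                using (var encryptor = aes.CreateEncryptor())
                    return encryptor.TransformFinalBlock(plaintext, 0, plaintext.Length);
            }
        }
    }

In a real app the PBKDF2 iteration count should be tuned upwards, and the AES key kept in the iOS keychain or Android keystore rather than alongside the data it protects.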

To finish off, there is a great quote that puts a good perspective on IT security:
"Security in IT is like locking your house or car - it doesn't stop the bad guys, but if it's good enough they may move on to an easier target." - Paul Herbka

Saturday, July 26, 2014

An introduction to Git - the history and getting started

At the July meetup the group met to discuss the merits and use of Distributed Version Control Systems (DVCS) and, more specifically, Git and GitHub.

For software developers who haven't used source-code management (SCM) before, it's useful to have some context on how it came about.



The story starts a few decades ago, in the 1970s.  In 1973 Ken Thompson and Dennis Ritchie put forward a paper on the subject of the UNIX operating system.  The concept was of great interest to the University of California, Berkeley, and in 1974 the mathematics and computer science departments pooled their resources and purchased a PDP-11 to run UNIX.  Their distribution of UNIX arrived on tape.

Ken wasn't involved in the installation of UNIX on the Berkeley PDP-11, and the team was having problems with the system: it kept crashing unexpectedly.  Ken was based in New Jersey, so to facilitate debugging the team used a pair of modems to allow him to dial in remotely.  At the time Berkeley only had a 300-baud acoustic-coupled modem without auto-answer capability, so with some transcontinental co-ordination the team managed to get Ken online to assist with debugging.

In 1975 Bill Joy joined the Berkeley team as a graduate student and became involved in the development of their UNIX system.  In 1977 Bill put together the first bundle of software built around the UNIX operating system core; it was called the Berkeley Software Distribution (BSD).  Acting as distribution secretary, Bill sent out 30 copies of BSD on tape in the first year.

Fast-forward to 1991, when Linus Torvalds was developing Linux.  Key to Linux development was the idea of community-based development, a foundation of the open-source movement.  The 1970s methods of mailing tapes and coordinating modem calls would not scale to support Linux development.

At first Linux development was managed by distributing archived files and patches, but this was cumbersome and error-prone.  Linus was fond of the BitKeeper system developed by Larry McVoy's company.  BitKeeper was built on similar design principles to an existing product called Sun WorkShop TeamWare, which, interestingly, was developed at Sun Microsystems, where Bill Joy went to work after Berkeley.  Larry agreed to grant a free-of-charge license to use BitKeeper to support Linux development; however, the Linux community was at odds over Linus' decision.  Members weren't happy that Linux, an open-source project, was now relying on a closed system.

The community's fears were not without merit: in 2005 Larry withdrew the free-of-charge license.  At this point Linus decided to develop his own DVCS, and Git was born.  Git was a command-line tool that developers could use for source-code management; it was distributed to support the community and, most importantly to Linus, it was fast.

A few years later, in 2007, Tom Preston-Werner, Chris Wanstrath and PJ Hyett started working on what was to become GitHub, a web-based hosting service built around Git; it launched in 2008.  Since then GitHub has become increasingly popular, and in a 2014 Eclipse community survey Git was used by 33.3% of developers, making it the dominant DVCS in the marketplace.

P3P: The Precise 3 Minute Presentation

Originally published on www.themethodology.net
In discussing how to encourage the members of the VancouverMobile.net community to share with the group, we began to consider what the format should look like. Professionally, people are often confronted with topics that they don't feel a passion for, making it difficult to impart a compelling story and leaving the presenter and audience feeling indifferent to what was shared.   On the other end of the spectrum, often we are so excited by a story that we don't impart the details coherently, missing an opportunity to share something interesting with our audience.

Many meet-ups use popular formats: elevator pitches, lightning pitches, or the longer TED-talk format.  Each of these formats has its strengths and weaknesses, but none of them really seemed to fit.  It was time to come up with our own format, and so the P3P (pronounced "pep"), the Precise 3 minute (and 33 second) Presentation, was born.

The idea of a P3P talk is very simple: encourage members of the group to stand up and tell a short story.  The length is intentionally short, three minutes and thirty-three seconds, to make it less intimidating and to encourage the story-teller to focus on a few salient points.  Key to the P3P talk is giving the audience enough information to get them interested, enough to make them want to follow up with the presenter.

The P3P format is easy to follow: within three minutes and thirty-three seconds, the presenter will impart a few pieces of information.  The following are guidelines on what we believe the presenter would want to cover:

1. What's the need?  For the first part of the P3P, the presenter will provide the background on the need that they identified. 
Most fairy-tales identify a need as something like 'rescue the princess from the dragon'.  In technology it might be a gap in the market, such as 'the ability to send messages cross-platform using data services rather than SMS' (WhatsApp).  This is the opportunity for the story-teller to grab the audience's attention and set the context for their P3P.

2. What's the solution? This is where the presenter has the opportunity to tell the audience how they met the need. 
Using the fairy-tale metaphor: "I used a silver-tipped arrow to kill the dragon and rescue the princess."  This part of the P3P is optional; sometimes the story-teller might not know what the solution is.  They may understand the need but haven't quite figured out how to address it yet, so they may be asking the audience for ideas and input on how to go about meeting the identified need.

3. How was it solved?  Sometimes the solution isn't as interesting as the journey; this is the opportunity for the story-teller to describe how they managed to get from the need to the solution. 
Using a different metaphor, The Lord of the Rings wouldn't have been nearly as interesting if the journey had been left out.  This section, like the solution, is also optional.  It's possible that the story-teller has an identified need and a solution but doesn't know how to get there, so it's a chance to pose the question to the participants and canvass ideas on how to get from need to solution.

That's it.  We are currently trying to decide what happens after the P3P: do we give an opportunity for questions?  It would seem convenient to offer a P3P presenter a five-minute slot, so they can cover their P3P with time for one or two short questions.  Regarding timekeeping, we haven't yet kept strict time for those presenting at the meet-up; we've had the luxury of letting people run over, but moving forward the intention is to tighten things up.  We're also looking at developing some tools to help P3P presenters prepare; more on that to follow.
written by Nathan Roarty @njr_itarchitect

Monday, July 21, 2014

Why might 2014 be the year of Wearables?

There has been a lot of tech chatter about how 2014 is meant to be the year of the wearable.  At the May Vancouvermobile.net meet-up the group came together to discuss trends and share their experiences of wearable technology.

Wearables are a subset of a much larger trend within industry to connect just about anything to the network.  The wider connected-device landscape is what the large IT marketing machines are calling the Internet of Things (IoT), encapsulating the trend to connect more and more devices to the network.  By devices we are referring to a whole host of things that previously did not have a network presence.  It started in earnest with smartphones but increasingly involves just about any electronic device that has a processor and often a sensor attached: from devices in the home such as refrigerators, televisions, heating systems and even light bulbs, to industrial SCADA devices such as meters and manufacturing sensors.

Recently, at Cisco's 2014 annual technology conference, Cisco CEO John Chambers opined that globally less than 1% of the devices that could be connected to the network are connected today.  Curious statistics aside, it goes without saying that there is an enormous market out there for devices and connectivity, waiting to be tapped by whoever gets there first.

Wearables, as a concept, aren't new; the idea has been around for quite some time.  Think of Bluetooth headsets: although not really active on the internet, they did align with wearables principles, using network technology to make a service, in this case telephone audio, more 'wearable'.  It is, however, curious to look at where the Bluetooth headset is today: they are certainly not perceived as desirable devices and are now most often seen being used by suits/professionals and taxi drivers.

It does appear that recently a number of key factors have been driving a re-invigorated interest in wearables:
  • Network ubiquity - not just WiFi, but cellular and Bluetooth.  The reach of connectivity keeps extending; alongside coverage, bandwidth is increasing and the price of that connectivity is decreasing.  The cost of cellular data is a point of much contention, but the increasing ease of accessing bandwidth through free WiFi arguably lowers the overall cost.
  • More processing power - nobody really argues with the validity of Moore's law; when you add the fact that this processing power comes in smaller, more power-efficient, and cheaper processors, there is a net increase in the amount of processing power in just about everything.
  • Miniaturization - the idea that things are getting smaller is often overlooked, but more is being squeezed into smaller packages.  Modern processors often include encryption on-board, commodity chips often include networking capability, and fewer pieces of silicon are needed to do more things; it's now easier than ever to make richly featured little things.
  • Commoditization - to top it off, all of these advancements are now more accessible and cheaper to buy.  It is no longer a case of whether you can source a component; today the questions are how many do you want and how much do you want to pay?

There is an interesting trend that is perhaps most apparent within the wearables space: the ecosystem required to deliver an end-to-end service to a consumer.  Looking at the layers of technology involved in a wearable service, there is a requirement for hardware, software and sometimes back-end analysis, and it is rare to see a single entity providing the end-to-end solution.  The hardware platform may require expensive research, development and prototyping if an existing platform isn't already on the market.  Hardware platform providers are increasingly looking to the software developer community for software innovation, exposing a plethora of APIs for the community to write code against.  For applications that depend on back-end services, a few established cloud providers offer easy and affordable storage, processing and analytics capability that previously had to be built in-house at great expense.  The model is still evolving, with a number of parties investing in the success of an end-to-end wearables platform.

Whether or not the wearables market turns out to be the untapped gold mine that some would lead you to believe, there are increasing numbers of devices on the network; how many of those devices end up being wearables remains to be seen.