Optimised code is something every programmer wants to write. We want our code to run as fast as possible using as few resources as possible. Unfortunately, optimisation can often seem quite daunting: from needing to deal with parallelism, to knowing how to use profiling tools, to potentially even dropping down to assembly and hand-crafting each instruction. Thankfully, most optimisation needn't be this complicated. Indeed, much like debugging, a lot of it can be solved with a few well-placed log statements and a bit of thought about the problem you're trying to solve.
Recently I've been working through a simple problem, first implementing a naive solution, then proceeding to optimise it. I thought I'd share how I approached it in the hope that it may help others, or at the very least give an interesting read. Thankfully the problem can be talked about in abstract terms using images. There will be a bit of pseudocode where needed, but for the most part this is intended to show how I optimised the algorithm without having to think purely in terms of code.
If you're at all familiar with Betteridge's law you'll know the answer to the title of this post already, but I want to discuss the reasoning why. Ever since Swift came out I've seen a lot of people who've jumped on the bandwagon proclaim that Objective-C is dead and anyone who isn't already investing in Swift is stuck in the past and will be left far behind. I've even seen some people proclaiming the past few days that developers who aren't using Swift now aren't worth hiring. Needless to say I think this is bullshit.
The mouse in the photo below has been my trusty companion for the best part of 9 years now. Together with the Microsoft Natural Ergonomic 4000, it has been one of my main input devices throughout most of my programming career. It features 5 buttons, a rocker on the side (which could work as 3 more buttons) and a high speed scroll wheel, all of which can be re-configured to do a multitude of tasks using Logitech's driver software. It is the Logitech MX Revolution and it is by far the best mouse ever produced.
Motivation is a funny thing. It’s wonderful when you have it. It will lead you on and cause you to start projects. It keeps you going even when it may seem logical to stop. But it can just as easily abandon you, leaving you with a project (possibly several) that you feel compelled to continue simply because you’ve put so much effort in already.
Like most people, upon hearing of Swift, I rushed to download Xcode and the programming guide to try it out. 48 hours later I decided to leave it. I’ve tried it on and off since then and, while it has seen improvements, my overall opinion of Swift has stayed the same: it’s not yet fit for purpose.
When you spend all day working on a computer one of the most important things you can do is make sure you have good input devices, be that a mouse, a graphics tablet or, in this case, a keyboard. When you buy a Mac you usually get a Keyboard and Mouse (or Trackpad) included. Unfortunately, Apple haven't made a mouse even approaching decent since the ADB Mouse II. Their keyboards are pretty good, but for someone who is typing all day they have some flaws, the key one being that they aren't particularly ergonomic.
Autolayout is an incredibly powerful API that allows us to build complex and flexible UIs with minimal code. Any layout you can imagine can be defined in Autolayout. However, that is like saying that any program you can imagine can be defined in a Turing complete language. It says nothing about the simplicity or intuitiveness of the solution. So why are some layouts so awkward to produce in Autolayout?
Storyboards seem to be a big point of contention in iOS development. Some see them as wonderful additions, some as a poorly designed and pointless hindrance that Apple seems intent on force feeding us. There is one thing that’s consistent though: almost nobody is using them right.
That’s a bold statement to make. It's based on the many conversations I've had with people, and the many tweets and blog posts on the issue. One of the key things I see is a rather innocuous question: NIBs or Storyboards? This highlights a massive misunderstanding of what Storyboards offer us, as it pits them as a replacement to NIBs. In reality they can be easily used to complement NIBs.
In this post I want to show how I’ve been using Storyboards and NIBs, together with a few ideas I’ve been throwing around for structuring apps. It may not represent the best way of doing things, there are even some parts I’m still unsure of myself. Hopefully it will give you some insight in how to get more out of Storyboards and NIBs, or at least provide a starting point for debate.
There was a post by Florian Kugler going round recently about Autolayout Performance on iOS. It looked at how much time it takes Autolayout to add views, and how this increases with the number of views. The post, while providing very useful information, didn't seem to best represent real world performance of Autolayout, instead showing a set of worst-case scenarios.
In this post, I want to look more at why Florian got the results he did. My hope is to highlight some bad practices one can have with Autolayout, and look at what makes both the statements that "Autolayout takes several seconds to layout a few 100 views" and "Autolayout can layout a few 100 views very quickly" true, despite their seemingly contradictory nature.
Autolayout has had a lot of bad press. A lot of people find it complex, confusing and more hassle than it's worth. They find the APIs a bit awkward to work with and the tools provided seem to work against them and break what they've done. I'm wanting to change that, so I'm working on various projects to help people learn and use Autolayout.
Yesterday the first clusterfuck elections were held for Police & Crime Commissioners. These are meant to be elected officials that oversee policing and crime prevention in 41 areas (excluding London). The turnout for these elections has been laughably low, ranging from 10% to just below 19%. These are the lowest peacetime election turnouts in history. Evidently many chose not to vote, me included. But why was that? I can't vouch for everyone but I can at least give my reasons.
One of the primary reasons is that we're in an age of austerity, yet we're somehow finding £100 million to throw away on an election that nobody really knows or cares about. Instead of spending so much money on a new election, why not spend it on… I don't know, more police?
8:30am: Hello and welcome to this coverage of Apple's October 12th Maps press conference. Apple called this conference in response to the uproar over their new Maps applications.
8:40am: We're noticing a lot of people filing in, dressed like tourists. I overheard one of them asking whether this was Buckingham Palace, and pointing to their iPhone claiming Siri sent them here.
9:00am: The music has stopped and Tim Cook is taking to the stage.
Emacs or Vim? Tabs or Spaces? Mac or PC? There are many arguments between developers. One of the biggest ones in the Objective-C community is over dot-syntax. It's the argument that just keeps going. I'm on the side that doesn't particularly like dot syntax, but people often misunderstand the reasoning behind this position. So I'm going to outline it here.
A new year and a new version of Xcode 4. This of course means that it's time for me to drop everything and try to find out everything that's changed. Thankfully (for me at least), this is a relatively small update feature-wise. I've been seeing two different responses to 4.3. Half of people seem to be experiencing a lot of crashes, but the other half seem to be seeing major performance improvements. I will be honest in admitting that I haven't really noticed either, but then again I'm still sure I have a special "more stable" build of Xcode that few others have. But besides that, what else has changed?
I'll start off by saying that I'm not suggesting you should not file bugs with Apple, you should. I'm also not going to suggest that Radar isn't a valuable tool to Apple, it is. But it's time to get something off my chest, something I've been wanting to rant about for ages.
I fucking hate Radar and everything related to filing bugs with Apple.
I cannot begin to explain how awful it is. First of all the radar software itself is a really dated web app. It's hard to find radars you've already filed and takes a lot of clicks to do anything useful at all. This alone really puts you off wanting to bother as it is so much effort to file even the most basic of radars.
I've spent a lot of time and words this year talking about what Xcode does have. At the same time I've spent quite a few tweets pining for certain features or wishing pain on those who are responsible for annoying bugs. I thought it would be an interesting idea to put together my personal wish list for the future of Xcode. These aren't in any particular order, though I have marked which are bugs I'd love to see fixed and which are features I'd like to see added. And finally I've included radar numbers where appropriate to allow you to file duplicates if you so desire.
[UPDATE 8/2/13]: I've struck through the titles of those items that have since been fixed
The great Apple software release of 2011 happened a few weeks ago, bringing the likes of iOS 5 and iCloud. But we don't really care about those in this post, what we care about is Xcode 4.2. If Xcode 4.1 was the Lion release, Xcode 4.2 is the iOS 5 release (although many of the improvements apply to the Mac as well). So let's get cracking and see what's new and improved:
Lots of people have been writing posts describing what Steve Jobs's passing and life meant to them and how he influenced them. I've been struggling to figure out how to put my thoughts together and whether I really wanted to, as others have said most of what I wanted to say far more brilliantly than I could. However, someone who generally pisses me off, pissed me off to an even greater degree than usual. Said someone is Richard Stallman, or as I shall refer to him henceforth, Dick (as suggested by Justin Williams). The reason for this will soon become apparent.
For those of you who don't know about Dick, well he's pretty much a guy who fights vigorously for open source in the name of "freedom", dismissing anyone who doesn't fit into his myopic world view as the enemy and generally being arrogant and pig headed. I'll leave it up to you to make the obvious connection to certain other political movements and leaders.
Apple is no stranger to throwing out the old and changing things around. You can build something incredible, that makes lots of money and makes many people happy, but if you let that stagnate then people start to get restless. It's how the new kid on the block comes along and steals your thunder. So you have to change. People will get annoyed and even angry, as human beings are generally averse to big changes, at least to begin with.
But change is important, change is what makes you survive. There is a wonderful quote by Darwin about natural selection:
It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.
Prior to Xcode 4 there existed the Build folder. I hated this folder, especially as I often wanted to zip up a project and send it to someone, and I'd end up having a huge file because I'd forgotten to delete the Build folder. So imagine my delight when Apple created the Derived Data directory in Xcode 4. This directory contains all the build products, intermediate files, indexes, logs etc. It is usually located in ~/Library/Developer/Xcode/DerivedData. The problem is, it is rather hard to find the particular Derived Data directory for a particular project…
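One workaround, as a rough illustrative sketch rather than anything from the post: each DerivedData folder contains an info.plist that records (under a WorkspacePath key) the project or workspace it was built from, so a few lines of Python can map the obscurely-named folders back to their projects. The function name here is my own invention:

```python
import plistlib
from pathlib import Path

def derived_data_origins(root=Path.home() / "Library/Developer/Xcode/DerivedData"):
    """Map each DerivedData folder name to the project it was built from.

    Xcode records the originating project/workspace path in each folder's
    info.plist under the WorkspacePath key.
    """
    origins = {}
    for plist_path in Path(root).glob("*/info.plist"):
        with open(plist_path, "rb") as f:
            info = plistlib.load(f)
        origins[plist_path.parent.name] = info.get("WorkspacePath", "(unknown)")
    return origins
```

Running it prints something like `MyApp-gslpsjfqaaqqgigpedmgtiwlsdfk -> /Users/me/MyApp/MyApp.xcodeproj` for each folder, making it easy to spot (and delete) the one you're after.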
Lion is a great OS. It has brought many great user features (Versions, autosave etc) and many fantastic developer features (Autolayout, popovers etc). It has brought one new feature though that is annoying a lot of users: Mission Control. I've mentioned some complaints I've got about it on twitter, but it's got to the point where I really need a blog post to convey my full opinion on it. I can't quite remember another change in an OS X update that was so big, yet so awful.
For those who don't know, Mission Control is a sort of replacement for Exposé and Spaces that tries to combine the two, along with full screen apps, into one central location for managing windows. It sounds good in theory, but as we'll see the reality isn't great.
Another Xcode release, another review of what's new. Xcode 4.1 coincides with the release of Lion and includes many improvements, some to help adopt new technologies in Lion and some just to make Xcode a better IDE. So let's get started with what's new.
So, I find myself here again, talking about the powder keg topic that is equality. My last post caused lots of argument on Twitter. Things seemed to have calmed down and everyone had gone away to reflect on things. Today Faruk, whose initial post prompted my last post on the subject, posted a sort of rebuttal to my post and another person's post. It is a much more reasonable and well articulated post, but I still disagree with several points. Now what I ultimately found through the arguments on Twitter is that Faruk and I both want the same goal, but we disagree somewhat on the means by which to achieve it.
First off, I want to outline the three key points I'm going to make, just for those who don't like lengthy posts:
Those are my 3 key points. You may agree or disagree with them already, but I hope you'll read the rest of the post to understand why I make them.
Over the past couple of years there has been a large influx of Objective-C developers. Some are coming from dynamic languages like Ruby or Python, some from strongly typed languages like Java or C#, and of course there are those who are new to programming altogether. But this means that a large number of Objective-C developers haven't been using it for all that long. When you're new to a language, any language, you focus more on fundamentals like the syntax and core features. But it is often the more advanced, and sometimes less well used parts of a language that really makes it shine.
In this post I'm going to give a bit of a whirlwind tour of the Objective-C runtime, explaining what makes Objective-C so dynamic, and then go into various techniques that this dynamicness enables. Hopefully this will give you a better understanding of how and why Objective-C and Cocoa work the way they do.
So if you're in the tech world you may have heard that several small iOS developers have been sent legal papers by what is commonly known as a patent troll, asking for money over a patent that apparently covers In App Purchasing. There are lots of questions, people are angry, and as usual patents are being berated.
I often like reading "translation" posts, where someone takes what someone said and puts it into the terms everyone was thinking. I also like reading posts regarding the problem of equality and representation of people at conferences and the lack of people who aren't straight, white men in the community, and how we can possibly solve it. I also have a lot of respect for Faruk Ates (@KuraFire on Twitter). So it was a disappointment this morning to read a "translation" style post by Faruk on the topic of how to increase the participation of minorities at tech and design conferences, which I almost entirely disagreed with and considered crass and unhelpful.
For those who can't be bothered clicking the link (though I strongly recommend you do), here's a brief overview. Apparently a big discussion took place last night on Twitter between Mike Monteiro and some others. Apparently Mike stated that conferences with only white, male speakers are unacceptable and MUST have female and black speakers. There was some back and forth on Twitter and then some people wrote some blog posts. Faruk took it upon himself to "translate" these posts, but sadly he seemed to completely miss the mark on many of the quotes.
NB: Some of the thoughts I expressed in this post have since changed, but I'm leaving it up in its entirety if only so I can occasional look back and see how my thinking has changed since I wrote it.
We all hear about how we need to make our applications more concurrent. There is no longer a free lunch for software developers, where processor cores get faster and give us a performance boost for free. Instead we need to try and run more code in parallel.
Unfortunately we also hear about how hard concurrency is. It isn't something you can just throw in easily. Concurrency is hard to conceptualise and incredibly hard to debug given the time-sensitive nature of many bugs. So you have to deal with locks to try and prevent things like race conditions. Basically it's a scary ball of complexity that keeps most developers away.
Thankfully, it doesn't have to be quite so scary. If you look at concurrency you realise that it isn't a multitude of problems but one problem that causes all the hurt: data mutability. And by thinking about concurrency in a new way you can simplify and to a large degree eliminate these problems, removing a lot of the need for things like locks in your own code, which can cause performance issues.
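To make that concrete, here's a minimal Python sketch (my own illustration, not from the post): the workers share immutable input data and each produces an independent result, so there is nothing to lock and no race condition to debug.

```python
from concurrent.futures import ThreadPoolExecutor

# Shared input is an immutable tuple: every worker can read it freely,
# and each writes only its own result, so no locks are needed at all.
prices = (9.99, 14.50, 3.25, 20.00)

def with_tax(price, rate=0.2):
    # Pure function: no shared mutable state is touched.
    return round(price * (1 + rate), 2)

with ThreadPoolExecutor() as pool:
    taxed = list(pool.map(with_tax, prices))

print(taxed)  # [11.99, 17.4, 3.9, 24.0]
```

The moment one of the workers mutates `prices` (or any other shared structure), the locks and the complexity come flooding back; keeping the data immutable is what keeps the concurrency simple.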
Schemes are one of the most interesting new things in Xcode 4, but also one of the hardest to get your head around at first. This guide will help you understand what schemes are and why they are useful.
It can be hard to find your way around Xcode 4 at first, especially coming from Xcode 3. This is a quick start guide for how to find various Xcode 3 items in Xcode 4.
So it is finally here. Xcode 4 has been released into the world and we are now allowed to talk about it. As my review of Xcode 3.2 went down really well I thought I would have a go at reviewing Xcode 4 in depth. I'll also be publishing other posts over the next few days going in to some of the bigger changes since 3.2 in more detail and hopefully helping you migrate. I've also put in radar numbers for all bugs and feature requests, so you can file duplicates or so any of the Xcode dev team reading this can find them. So without further ado, what is new in Xcode 4?
This is a blog post I've had on my "to write" list for a while. At the end of December Buzz Andersen posted a link on twitter that outlined why he doesn't feel comfortable with Pair Programming or the Agile ideal. A conversation followed between several developers, including myself, which was nicely archived by Manton Reece using his Tweet Library app. I recommend reading through the archive to get an idea of what was said.
Now a few weeks later when Manton posted the archive a thought occurred to me. There is a lot of stuff out there which developers need to look at. Lots of concepts, lots of patterns, lots of tools. With this massive jumble of stuff it can be quite easy to lose sight of what might be the core principles.
There are very few absolutes in programming, very few things that, if you don't do them, then eventually things will blow up. These are the core principles of development. There are lots of things out there that claim to be core principles, but you can develop good software without them, and they are usually just one way of using the core principles. The problem is that it is hard to figure out what these core principles are. I don't know what they all are, I doubt anyone does. But you can tell a core principle when you see it. I'm going to go through two examples of things that some consider core and what about them is actually core.
I've just finished watching David Heinemeier Hansson's keynote talk at RubyConf. It's a great talk about what makes Ruby great in his eyes and why he hasn't bothered learning other languages since discovering Ruby. There are some things that I disagree with or that contradict each other (at one point he says Ruby protects you from pointer arithmetic, but later that he likes Ruby because it doesn't stop you from doing things even though they can be dangerous if used incorrectly) but on the whole it's well worth watching.
Except for one point. It is about a sentiment that, at least to me, is shared a lot in the Ruby community (and also to a degree in the scripting language community in general). To quote DHH from the talk: "The programming equivalent of having your balls fondled when you go to the airport is… type safety". Now this post isn't going to be about type safety as such, but about something that encompasses type safety, which, as the title of the post suggests, is: being explicit.
So I read through the slides of a talk recently, by the incredibly talented Anna Debenham. The talk is about the state of web education in schools. It is an incredibly good talk and thankfully there is a video of another talk by Anna that covers pretty much the same things here, which I highly recommend you watch. The thing is, what Anna says about web education equally applies to any form of software design and development. The education system teaches it badly, all the way from primary school through to university: courses are either too theoretical or too outdated.
I've long thought about building the ideal software development training course. The problem is there is no good course that I know of that makes good developers. There are courses out there that teach you software engineering or computer science, but they output very few good developers. There are also many courses out there that teach you how to program in certain languages.
The issue is, a lot of these courses are either too theoretical (computer science) or too tied to a certain language/toolset (pretty much all of them). You end up with lots of C programmers or Java programmers or C# programmers but very few software developers. Now, there is nothing wrong with computer science. It teaches people how to be computer scientists. The issue is that a lot of people who take computer science are wanting to become software developers, not computer scientists.
There has been a lot of anger over the Government's decision to increase the cap on university tuition fees to £6000, with £9000 allowed in some circumstances. There is also anger over the increase in interest rates. However, there has been little proper analysis into how this works out financially. After all, these aren't real loans.
So what is the situation for students today? You pay £3290 a year for tuition (this actually increases in line with inflation) and get roughly £3500 or so in maintenance loan (this is a rough guess based on what I got at university). You start paying interest on this loan at a rate defined by the rate of inflation at a certain point in the year. So already it is better than any other sort of loan. But where it is really different is you pay back a percentage of your income over a certain limit, which at the moment is 9% of income over £15,000. And if after 25 years you have any remaining balance, the amount is written off.
So what will change in the future? Well, the fees will be around £6000-9000, the interest rate will be tapered from 0% on incomes below £21,000 to 3% + inflation on incomes above £41,000, the payback threshold will be raised to £21,000 and the payback limit will be increased to 30 years.
Now at first glance that seems pretty bad: fees increase by 2-3x, there is a potentially large increase in interest rates, and it's 5 years longer before the debt is written off. But the key thing being missed is the £6,000 increase in the payback threshold and the fact that interest rates are progressive. This is what makes it a more progressive system than before.
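A quick back-of-the-envelope sketch shows why the threshold matters so much. The figures are the post's; the code and the example salary are just my illustration:

```python
def annual_repayment(income, threshold, rate=0.09):
    """Student loan repayment: a fixed 9% of income above the threshold."""
    return max(0.0, income - threshold) * rate

salary = 25_000
current = annual_repayment(salary, threshold=15_000)   # today's £15,000 threshold
proposed = annual_repayment(salary, threshold=21_000)  # proposed £21,000 threshold

print(current, proposed)  # 900.0 360.0
```

So despite the higher fees and interest, a graduate on £25,000 pays £45 a month less under the new rules; the larger balance simply takes longer to clear, or is written off at the 30-year limit.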
So… that Mac App Store thing. I've been wanting to write up my thoughts on it for a while but just haven't found the time or drive to. Then I read this post by Marco Arment. It's an interesting read, but is from the perspective of an iOS developer and I can't say I agree with much of it.
Ultimately, there are two extremes to how people see the Mac App Store panning out:
1. Much more exposure for Mac apps with huge increases in sales, but prices won't change
2. We'll see prices drop to iOS levels and possibly also see the kind of apps that brings
I think both of these are actually wrong to a relatively large degree. I'll start with number 2.
It's no secret that I'm not a big fan of Ruby's syntax. I've grown to love Ruby's functionality and I would like to see language support for some of it in Objective-C (symbols, non-alphanumeric method names and modules/mixins come to mind). I'm also not too opposed to the syntax of Ruby per se. It is concise and fairly readable.
The issue I have is the huge amounts of flexibility there is in the syntax. A bit of flexibility is nice, especially for conciseness. I wouldn't mind the ability to define arrays, dictionaries, sets and numbers in a shorter syntax in Objective-C. The issue I have is where flexibility causes ambiguity, especially when something is completely identical. This is why I have an issue with the dot syntax in Objective-C, as it looks identical to struct access and adds ambiguity.
The title of this post is a little misleading. I'm not aiming to define discrimination, that has been done already. What I'm wanting to do is put down my definitions of what should and should not be illegal to discriminate against. This isn't to say what is right or what is wrong, there are some forms of discrimination I disagree with but think shouldn't be illegal.
I've been wanting to write this article ever since I heard the response to the incident in the UK back in March, where a Christian couple who ran a B&B turned away a gay couple. For those who haven't heard about it, or forgot, you can read about it here: http://news.bbc.co.uk/1/hi/england/8578787.stm. The whole story brought up an interesting question though: if turning away someone because they are gay is discrimination, then surely forcing a Christian couple to take in a gay couple is discrimination against that couple?
I put my old iMac up for sale on eBay yesterday. A few hours after it went on sale, I saw that someone had bought it at the "Buy Now" price. I got a bit suspicious as they had only signed up that day and had no feedback. My suspicions were confirmed this morning when I received emails from eBay saying that the buyer had left the site. All credit to eBay, I was able to get through to someone on the phone within 90 seconds of calling (despite them saying there was a large number of calls) and get my fees refunded so I could re-list it.
Anyway, I checked my spam folder on a whim a little while ago and found some emails from the fake buyer. Turns out they had sent money to a PayPal account (that no longer exists) and were wanting me to ship the iMac to Nigeria. I could have just left it, but I thought "why not have a little fun". So here is my reply:
There is a misconception that in order to get the effect of the retina display on the iPhone you need a display that is 326 dpi (dots per inch), regardless of the device. Now retina display is a marketing term of Apple's, but for the purposes of this blog post I'm going to define a "retina display" as follows:
A display where a person with 20/20 vision is unable to tell apart individual pixels at a normal viewing distance
This definition is important for calculating what counts as a retina display for various devices. All a retina display is doing is taking advantage of the limited resolution of the human eye, much in the same way that a film takes advantage of the limited frame rate of the human eye. But whereas the eye's limit for motion is roughly 24 frames a second no matter what, its limit for detail is a function of the distance to the object and the quality of the eye. As such, if you put an iPhone 4 right up to your eye, you can still make out individual pixels.
For a good explanation of all of this, and for where I got the basis of my calculations, I highly recommend reading this blog post.
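The underlying calculation is simple enough to sketch here. Assuming 20/20 vision resolves about one arcminute of visual angle (the standard figure used in these calculations), the dpi needed for a "retina display" falls straight out of the viewing distance:

```python
import math

def retina_dpi(distance_inches, acuity_arcmin=1.0):
    """Pixel density at which one pixel subtends one arcminute of visual
    angle at the given distance -- any denser and an eye with 20/20
    vision can no longer resolve individual pixels."""
    pixel_size = distance_inches * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pixel_size

print(round(retina_dpi(12)))  # a phone held ~12 inches away: 286 dpi
print(round(retina_dpi(20)))  # a desktop display at ~20 inches: 172 dpi
```

Which is why 326 dpi is comfortably "retina" at arm's length, while a desktop display viewed from further away qualifies at a far lower density.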
My health hasn't been all that good the past 3 years. Up until 4-5 months ago I was suffering from a mental illness. And now I have a physical illness which is technically even worse than what I had before. Oddly enough though I've been quite open about the physical illness and fairly quiet about the mental illness. This blog post will change all that and give details about them both: what they are, how they affect me and how you can get help if you need it.
OK, let's be frank. 32 bit is dead on the Mac. Apple cut out most of their 32 bit users when they dropped PPC support in Snow Leopard. Those that remain are 3-4 year old machines, so are likely to be replaced in the next 12 months given the 3-5 year upgrade cycle most people have. 32 bit is a legacy platform and should be treated as such.
Given that the last 32 bit machines were sold only a year after the last PPC machines, I would not at all be surprised if OS X 10.7 dropped 32 bit completely. Like PPC it is unnecessary cruft when the future (and several years of the past) is entirely 64 bit. Given this, I'm also phasing out 32 bit support for M Cubed's apps as I move them to be 10.6 only. The first of these will be Lighthouse Keeper 1.2.
So yet again I saw a tweet about the impending death of the Mac in favour of the iPad and yet again I feel the need to blog my answer rather than have 10 conversations about it on Twitter. Here is the tweet:
RT @joehewitt: You'll know the Mac is officially dead when Apple releases Xcode and Final Cut Pro for iPhone OS. <- +1, we're on that path @stroughtonsmith
(NB: From the 3 posts so far on this blog (including this one), you'd assume I have something against Joe Hewitt, given that two of them are arguments against things he's said. I respect Joe, but I disagree with him a lot about his views on the future of the iPad and Mac.)
OK, so I'll flat out state that Xcode and Final Cut Pro will not make it onto an iPad (nor will Photoshop, Word etc) without being either less powerful or less productive. This isn't a case of what the SDK is capable of (though Apple would need massive exemptions to App Store policies for an Xcode iPad app) or what the iPad hardware is capable of. It is simply a matter of user experience.
These are very large and very powerful applications. They do a lot of stuff. To some degree some don't do enough (I'm looking at you Xcode). These applications just aren't well suited to the iPad.
If the Mac market is going to shrink to the size of the current market for Mac Pros, perhaps that will be the only model they keep alive?
I don't agree with these tidings of doom for the Mac. The fact is, a desktop computer is still the best tool for many tasks, and a touchscreen tablet won't replace it unless it becomes a desktop computer. The iPad form factor/input mechanism is inherently flawed for certain tasks where the desktop excels.
Welcome to my new personal blog. I've decided that I need a place to write my thoughts down, rather than spewing them over twitter or IM or other more realtime forms of communication where my hands type faster than my brain works.
And before you say anything, I know that there is no X, Y or Z. I'm running on a custom blogging app written using Django. I thought that this would be as good a project as any to learn Django with. There are some things that I haven't got round to yet (RSS feeds), some things I don't need quite yet (pagination) and some things I'm just not adding (comments).
Keep an eye out for new posts as I plan to write a lot of stuff up on here very soon and possibly transfer some posts from my old, now-defunct personal blog. Hopefully it will be interesting.