So people are lauding Flickr, suggesting that they’ll switch to it by way of protest, or as a more acceptable way of storing their photos.
But here’s the rub. Flickr’s a crock of shit.
OK. I’m being harsh. Here’s the reality, for me at least.
Flickr is a wonderful repository. It allows me to use a relatively intuitive user interface to upload my photos (and videos under 90 seconds in length). And it allows me to share these with fellow Flickr users that I deem to be either Family or Friends. I can tag stuff, group stuff, and map stuff. All rather lovely. (The 90-second cut-off is rather limiting and irksome, but not a major annoyance.)
It also has an add-on called Picnik that allows me to do some basic edits to the photos: cropping, filtering, rotating, removing red-eye etc.
But it’s not the storing of images that I have an issue with. My issue is with the user experience of the viewer. It’s appalling.
First, my friends and family. They must be Flickr account holders. In a day when Facebook is becoming the de facto standard for online identity (at least for social stuff), this is criminal.
My mum doesn’t want a Flickr account. And I want to share my photos with people I know, love and trust without forcing them to sign up to another service. I would estimate that 90% of the people I want to share my photos with have no interest in having a Flickr account.
And once they’re in, my view is that the user experience is, at best, poor. My photos are presented to those I share them with sequentially, linearly. The first page shows my five most recent photos, all nice and big. If I flick to page two or beyond, the pictures become smaller, the 18 photos per page becoming more akin to a set of Windows Explorer thumbnails than anything more inviting.
I can click on any photo to access more sharing options, see where it was taken or to access a higher resolution version of the image. But it’s all so very functional.
To the viewer, the Flickr website has changed little in the four and a half years since I became a pro member, and changed little in the three or so years before that when I was a non-paying customer. It’s vanilla. It’s linear. It’s functional. It doesn’t embrace users and take them on a journey. It doesn’t give them the sense that they are experiencing the event, the concert, the playground, the dinner, the airshow, the beach walk alongside the photographer.
And it should. Yahoo! has the ability to bring photos to life, to create an absorbing experience that people want to come back to again and again. Montages, full-screen slideshows by default. It has the ability to exploit Facebook’s credentials (and user base) to draw people into its service, while at the same time converting an increased number of users into its premium service to pay for the platform.
Or else Flickr can continue being left behind by its competitors and, with time, become a relic of the internet.
I only hope you’re reading, Marissa.
On 22 November, I decided that enough was enough. I needed a new music solution.
All of my digital music was residing on my old, largely defunct laptop which I’d replaced in June. It was sitting in iTunes in a library that I was far from happy with, as I’ve documented in a previous post, titled How Apple ruined my music collection.
Now as many of you will be aware, iTunes is the biggest piece of shit ever to grace a PC laptop. I know many of you Apple fanboyz/girlz will wax lyrical about how wonderful it is on a Mac (although I understand that there are even Mac users who hate it). But on a PC, it’s supremely appalling. Dog shit, if you will.
But that aside, the problem with digital music is that it came too soon. People had big music collections. Mine weighs in at a respectable but by no means mind-boggling 5,500-ish tracks. At maybe 4MB per song, that’s around 20GB of music.
Computer hard drives could just about cope with such volumes when iPods were first introduced in the very early part of the new millennium. But iPods could not. They started at 5GB, although they soon grew large enough to cater for my 20GB.
But then smartphones were introduced. And these came with flash storage rather than spinning discs. This meant that they were faster, quieter and much more worthy of a hug. But it also meant that their storage capacity was limited. And it meant compromise. You were (I was) unable to store your entire music collection on your portable device. So you had to pick and choose.
Even today, over eleven years after the first iPod came out, my Google Nexus packs a rather paltry 16GB of storage. But that storage is for everything. Currently, about 5.5GB of it is used for apps, photos and data other than music. A further 2.3GB I am unable to access (the Android OS, I expect). Leaving just over 8GB for music, if I so choose. Not enough for my entire music collection.
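The arithmetic above can be sketched in a few lines of Python. All figures are the rough ones from this post, and the 4MB-per-track average is an assumption rather than a measurement:

```python
# Rough storage arithmetic from the post (all figures approximate).
TOTAL_GB = 16.0      # Nexus internal storage
OTHER_GB = 5.5       # apps, photos and other non-music data
SYSTEM_GB = 2.3      # inaccessible (presumably the Android OS)
TRACKS = 5500        # approximate size of the music library
MB_PER_TRACK = 4     # rough average assumed earlier in the post

free_for_music_gb = TOTAL_GB - OTHER_GB - SYSTEM_GB
library_gb = TRACKS * MB_PER_TRACK / 1024

print(f"Free for music:  {free_for_music_gb:.1f} GB")
print(f"Music library: ~{library_gb:.1f} GB")
print(f"Whole library fits? {library_gb <= free_for_music_gb}")
```

On those numbers the library weighs in at roughly 21.5GB against about 8.2GB of headroom, which is precisely the compromise described above.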
Over the years, I’ve upgraded laptops a few times, and music has been lost along the way. I’ve restored partial music collections from iPods. DRM-ed music confuses the hell out of me, and I’ve slowly grown to loathe everything that iTunes is about. It could have been so wonderful. But instead it contributed significantly to fragmenting my music collection. (Every time I’ve upgraded my laptop, I’ve struggled long and hard about how to move my music across.)
Now I’ve often thought about buying a NAS. But I don’t really have a N to speak of to which I can A the S. And they sound that bit too scary. So I haven’t.
But then along came Google Music.
Overnight on 22 November, and throughout most of 23 November, my old laptop’s internal fan was in overdrive as the laptop was resurrected to upload 4,705 songs from its music library into the Google Play Music cloud. It was working. And I felt huge relief and excitement. (There are about 400 tracks thereon that won’t upload, but I’m not quite sure why. It may be something to do with DRM. They’re probably those ones with the funny icon next to them in iTunes, an icon that I don’t comprehend and that has no hover text.)
And now it’s there, it’s lovely. I can play it direct from the Chrome browser. No need for installs. Just lovely. Some of the metadata has been maintained from iTunes, including number of plays. (Sadly, the five-star iTunes rating has been replaced with one with only three levels: thumbs up, nothing or thumbs down.)
And while all of the music can be streamed from the Android app on my phone (which over 3G might rack up some big bills), I can also highlight specific music that I want to store locally. And that music has been downloaded to the Nexus to use up some of my spare disk space until such time as available phone storage exceeds the size of my music collection.
The only thing I’d like now is the ability to stream to my Sonos player. I’m expecting that’s on its way.
In the meantime, I’m happy. Happy that I again have a definitive music collection, one that is not tied to a device for the first time since I collected CDs.
My first experience of coding came courtesy of a ZX81. And later through the ZX Spectrum. On each computer, you didn’t need to type commands. Indeed you couldn’t. Each key represented a shortcut to a command: PEEK, POKE, PLOT, RUN, REM etc. If you were at the appropriate point in a line of programming, pressing a letter would yield a command instead of the letter itself. It was all rather convenient, if, in hindsight, a little limiting.
Then came the BBC Micro. The convenience of shortcut commands was no more. To RUN, users had to press three keys as opposed to one.
The world didn’t end. But at that point, I genuinely believed that programming would never take off.
You see, programming demands syntactic perfection. Commands must be spelt correctly. Semicolons must, where rules demand, feature at the end of commands. Quotation marks must surround certain types of text, and each function commands its own imposed structure.
And with such grammatical idiocy and general shoddiness surrounding us, I was of the belief that programming could never survive. I didn’t think that the general public could reliably be expected, with or without the aid of command shortcuts, to type faultless lines of code.
It was a brief thought during my early teens, perhaps earlier. And maybe it’s a sentiment that I carried forward in my career as a proofreader: no one can be trusted to write faultless English.
Thankfully I’ve been proved wrong. Code abounds, and compilers are technology’s proofreading equivalent.
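That proofreading is easy to demonstrate. As a minimal sketch in Python (any compiled or interpreted language behaves similarly), a single missing quotation mark is enough for the compiler to reject a program outright, before any of it runs:

```python
# compile() acts as the proofreader here: it refuses syntactically
# imperfect code before a single line of it is executed.
good = 'print("hello")'
bad = 'print("hello)'   # missing closing quotation mark

compile(good, "<example>", "exec")   # accepted without complaint

caught = None
try:
    compile(bad, "<example>", "exec")
except SyntaxError as err:
    caught = err
    print("Rejected:", err.msg)
```

Unlike a human reader, who would happily infer the intent, the compiler offers no such charity; it simply points at the fault and waits for it to be fixed.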
That #Flickr is advertising as “New” the newly introduced “Upload” button in their primary nav. is what makes #Yahoo! doomed.
That Upload button is the most significant software upgrade in the last 24 months. #Flickr #Yahoo!
Reading Gizmodo’s article on a similar subject just now, I realise I was quite close to the mark.
Since Flickr was bought by Yahoo!, there have been few technological developments that are visible to the end user. The odd screen may have received a makeover, and I think I can now see other people’s photos on maps. But beyond that, there’s little.
In fact, the main difference to me is that I was forced to start paying for the service (on 30 July 2008), as I needed to upload more pictures than the free account allowed. I don’t resent this. All services should be paid for.
What I resent is Yahoo!’s lack of focus on me as a user.
I’d love to be able to log in with other systems’ credentials.
I’d love to be able to allow my Facebook followers automatic access to my protected photos.
I’d like a much richer interface for seeing my friends’ photos.
I’d like to be able to upload videos more than 90 seconds in length.
I’d like all photos taken on my smartphone to automatically upload to Flickr when in WiFi range, and for these only to be accessible by me.
I’d like a much improved interface into my own photostream—timelines, montages, albums.
I’d like my geotagging of my own photos to have a positive impact on me, rather than merely fuelling other people’s views of the world.
Instead, I have none of these features. Instead, I have the privilege of losing access to all bar 200 of my photos if I ever stop paying.
Flickr will not die. Certainly not for a long time. Instead it will live a long dull life. Its user base will remain uninspired and unimpressed. And its functionality will creep forward while those around it bound past it. I wonder whether I’ll be a part of it.
Increasingly, we’re being pointed in our online worlds to things that other people read. BBC News lists its ten most-read articles. And more recently, Facebook flags to us the articles that our friends are reading on newspapers such as the Guardian and the Independent.
But to me, reading an article signifies one or more of three things. You might have an allegiance with the newspaper publishing the thing that you’re reading; you might have an interest in the subject being written about; or the headline advertising the article (for that is what headlines do) might have grabbed you.
But reading an article does *not* necessarily signify that you agree with it, nor indeed that it’s worth reading. And this is what I have a problem with in such sharing.
The buttons that now adorn our articles and blogposts are useful. They invite us to make an active decision to share the content (or not) and indeed how to share it. But those technologies that automatically share with our networks the things that we are reading trouble me. Content with a catchy title will be read. Its being read will be shared. And this will cause it to be read yet more.
My daughter, five, is growing up surrounded by technology. From TVs to smartphones, from Kindles to iPads. Spotify, Sonos, Bluetooth. They’re all commonplace.
But there are two technological advancements in particular that weren’t there during my childhood that to her are simply part of life. In fact, the absence of either one causes befuddlement on her part. They are: touchscreens, and rewindable TV.
If she needs the loo while watching TV, she will always ask, “Can you pause it?” And when we went to see my parents at Christmas, she was confused by their Freeview TVs not being pausable.
And similarly, my Kindle is anathema to her. My wife’s iPad and our respective phones have conditioned her into thinking that all handheld devices react to touching the screen. When the Kindle doesn’t do anything, she’s bemused.
To me, as to many, there have been so many advancements in technology that it’s sometimes difficult to figure out where the most important leaps have been made. But instead of looking at what’s outstanding, maybe the key is to look at what seems like a no-brainer to the youth of today.
Facebook is there to connect you with your friends, old and new. Its purpose is to allow you to share moments of your life, and enjoy moments in other people’s lives, generally to bring people closer together.
Meanwhile, Google is trying to make your online experience simpler and richer, providing applications such as email, maps, calendars and documents to make things easier; the likes of YouTube and Picasa to make things richer; and a kick-ass search engine to allow you to find exactly what you’re looking for.
And Apple is intent on providing a beautiful and simplistic experience when connecting with technology, both in the devices themselves and the interfaces that they support.
That’s the gloss. Here’s the cynic’s view.
Facebook is intent on ensuring that when people connect with other people or organisations online, they do it in the confines of Facebook. And in so doing, they will amass a swathe of data about you and sell it to the highest bidder, allowing advertisers to sell their wares to you, whether you like it or not.
Google is intent on ensuring that when people want to do anything online, they do it through Google. And in so doing, they will amass a swathe of data about you and sell it to the highest bidder, allowing advertisers to sell their wares to you, whether you like it or not.
Apple is intent on ensuring that when people want to enjoy media online, they do it through Apple. And in so doing, they will take a large cut of the profits involved.
There is a fine line between being perceived to do stuff that your users will love and being perceived to do stuff that will abuse the trust of those users. Ask Microsoft. And maybe treading that line most carefully will decide who wins the battle of the user technologists.
The war continues between the major players in the personal IT services market. Joe Public will mostly see the veneer. They’ll see the shiny iPad, the beautiful way in which Google Maps helps them get to grandma’s house, the way in which they can now connect so easily with people who might otherwise be strangers. But behind the gloss, evil is going on.
In the 1990s, Microsoft became huge. And along the way it became hated. It tried to use its position in certain markets to push into others, often forcibly. Internet Explorer being packaged with Windows was one such foray. Maybe this hatred goes with the territory of being big.
Now we have, among others, Facebook, Google and Apple.
Each of them wants to own certain aspects of our online experience. Apple is doing so through hardware: the iPad and the iPhone. Facebook is doing so through software: its website. And Google is doing both: its swathe of applications now being complemented by hardware in the form of the Samsung Galaxy Nexus, although this latter facet is being done through partnerships rather than directly.
But these companies, particularly Google and Facebook, are no longer inspired by making their offering lovely for the customer. (I genuinely believe they once were.) The companies have shareholders and these shareholders demand inordinate returns on their investments. And to do this, each company must milk its offering for all it’s worth.
In doing so, they will be perceived as being evil. They will open up your data to advertisers and use every opportunity to increase the value associated with your using their offering.
Arguably, the current king of the three is less into this business. Apple is about building beautiful hardware and operating systems. It’s less about building apps that can maximise the value of your interactions while using them. My betting is that they will move more and more into this area over the coming years.
To use a mathematical analogy, there is, I expect, a maximum that the companies need to find. If they don’t push enough to maximise the value of our interactions with their services, then they will lose out to those players that do. And likewise, if they go too far in using that data, customers will become tired and go elsewhere. Finding that maximum will be the big challenge.
It’s going to be an interesting race to watch.
It’s a little over three weeks since I made the bold decision to sever links with the iPhone in favour of the Samsung Galaxy Nexus. Here’s a brief account of the reasons, and my experience thus far.
So first of all, why? I was previously on an iPhone 3GS. Each iOS upgrade came with new functionality but degraded the speed, and towards the end, the operating system became unacceptably slow to the point that a hardware upgrade was necessary. (When did we start talking in this way with respect to phones?)
My immediate question was this: should I upgrade to the 4S or should I wait for the next Apple phone release? Decisions, decisions. But then I realised how stupid I was being. Apple had brainwashed me, or perhaps I’d brainwashed myself, into thinking there were no alternatives to the iPhone. So I searched beyond the walls of the mothership.
Before doing so, I double-clicked the iPhone Home button. This told me which apps I used, in the order in which I’d last accessed them. And it soon transpired that the vast majority of apps I accessed had Android equivalents, most of which were free. My only area of concern was mobile music. More on that later.
The only viable alternative to me seemed to be an Android phone. And it soon came down to a toss-up between three phones: the Galaxy SII, the Nexus S and the Galaxy Nexus. Without having seen a Nexus beyond the odd billboard ad, I opted for the Galaxy Nexus. It was new, it was cool, and it had Ice Cream Sandwich, whatever that may mean.
And so far, I adore it. Here’s why.
First, navigation. There is a three-button screen-based navigation bar that comes with the operating system. It sits at the bottom (or to the side in landscape mode) of each app, and allows you to do three things: go back; go home; or go to a screen allowing you to scroll through your open apps. The back button works within the app you’re in; the other two take you beyond that app. And they sit together beautifully. They make sense.
Next: sharing. The operating system is beautiful for sharing stuff. In-app buttons invite you to share using the apps and accounts you’ve connected to the phone or the technology that it comes with. Clicking on a tweet, I can immediately share it using: Bluetooth, Facebook, Gmail, Google+, SMS, Facebook Messenger, Note (a Post-It app), Twitter itself or WordPress. For me, photos have further options of Picasa, Flickr or Send to BBC News. It’s just so simple and accessible. And I use it lots.
Next: notifications. They’re reliable, obvious yet unobtrusive. They sit in a little bar at the top of the screen and you can ignore them or investigate them. They just work. I found that beyond email and calendar, iPhone notifications were unreliable. And the grouping of apps that came in later versions of the OS didn’t sit well with the red dot that appeared when apps had notifications. Android seems to have this nailed.
Now: the camera. Comparing it to the 3GS camera, my early view is that the photos are a lot clearer. And there’s a lush feature that allows you to capture panoramas from the onboard camera app. Here’s a rather good example that I took at the weekend.
Finally: Google. I find using Gmail is a dream compared to the iPhone equivalent app. That said, the Google Calendar app is appalling and needs an overhaul. I’m sure that’s on its way. And somewhat oddly, I don’t enjoy the Google Maps app quite as much as I did the iPhone equivalent. But the difference here is minimal.
The only big drawback thus far is music. I’ve tried using Songbird, but haven’t yet had sufficient time (nor inclination) to figure out whether this is my solution. Early tests suggest that its conversion of AAC files to MP3s can chop off a bit from the start and end of the track. But there’s more work to be done on this front.
Thus far, the experience has been utterly pleasurable. I urge you to think outside the box.
Update: the only specific comparison I made between the 3GS and the Nexus related to the camera. This is somewhat unfair, as the iPhone 3GS had a three megapixel still camera, while the iPhone 4S comes with an eight megapixel one. (The Nexus camera is five megapixels.) All other comparisons are valid, as they refer to the operating system and apps, rather than the hardware itself.
The people I know are, on average, more technically savvy than the average person in the UK. And I am, I would say, more technically savvy than the average person I know. Yet I am confused.
In all honesty, I can’t be arsed. I do Twitter—my 17,417 tweets are testament to that. And I do Facebook, in a rather half-arsed way. And yes, I blog, albeit less vociferously than I have done: 1,766 posts over the last seven and a half years.
And for me, from a social perspective, that’s enough. I use other tools professionally—Google Apps, Dropbox to name but a couple. But socially, for me, there isn’t space for anything else in the virtual world.
Twitter fuels my interest in things. And Facebook fuels my interest in people.
There are overlaps. But Twitter is not sufficient to fulfil my online friendship needs (although over time, it’s become more important in this regard). And Facebook certainly isn’t sufficient to fulfil my interests (and this position has changed little over time).
Pinterest, Tumblr, Posterous, Google+, Yammer. Maybe I’m missing out by not participating. But to be frank, I have neither the time nor, more to the point, the inclination.
I took a squizz at www.gov.uk the other day. Below are my very high-level views.
It looks like the search engine is pretty intelligent, more so than any I’ve seen outside Google. But stray far beyond the topics you’re invited to search on and it struggles. I’m not sure whether this is because the search engine isn’t that good after all, or whether it’s because the content just isn’t there yet.
A search for “unemployment” gives residential training for disabled adults as its first result, out of a total of only seven results, most of which are irrelevant.
Their “Related Topics” module on the right seems quite relevant to the articles, for what it’s worth.
Jesus. It’s phenomenally flat. Maybe the idea of a tree structure is out of the window now. If so, they’ve embraced this concept. I have no idea where I am in the site. Maybe if I’m only there to look for a specific thing each time, this doesn’t matter. But to me it’s disconcerting.
It’s dumbed down. I thought that Directgov was aimed at the right level: non-complex content in relatively bite-sized chunks. This is a lot more bullet-y and assumes very little of the reader. Maybe it appeals to the masses. It doesn’t appeal to me.
As I say, it’s high-level. But these are my findings nonetheless. Controversial?