Well, last week I did something I should have done ages ago… I picked up a laptop specifically for use “on location” in my occasional photojournalist-type activities.
That’s to say, one I can carry with me to an event and use to post to blogs or upload photos or whatever in “real time” should the need arise, and particularly where the event lasts for longer than just a day.
In other words, I’m unlikely to lug it around with me if I’m just going to cover a “one off” demo. But something like the recent Climate Camp… well, that’s another matter entirely.
Although I’ve been aware for quite some time that I really should have such a bit of kit, the actual incentive to get it was driven by two separate and specific experiences.
The first occurred during (or rather, immediately after) my coverage of the Carnival Against the Arms Trade in Brighton, back in June this year (see this post).
At the end of the protest/demo I found myself at a sort of club or social centre frequented by activist types where some of the locals were desperate for photos to send out straight away to the media, press agencies and suchlike.
Well, me being on hand I was asked if I’d mind any of the pics I’d shot being used.
Stupid question. Of course not. That’s what I was there for.
But we then ran into a huge problem. I shoot in RAW only. And their set-up wasn’t geared up to process RAW files. And even if I’d processed the shots in-camera into JPEGs they had no means readily available to transfer them to their computers. Nope… not even a card reader!
Well, after a lot of messing around, and downloading/installing some software, and me madly dashing out to get a USB card reader from the local Maplins I eventually managed to get some pics sorted that they could use.
But alas, too late. Another photojournalist-type had pipped me at the post. Dammit!
Realistically of course, being that it was a one-day event I probably wouldn’t have taken a laptop (along with all necessary gubbins) anyway… assuming I’d had one.
What the incident did cause me to do though was make damn sure I always carry, at the very least, a card reader with my photo kit.
Nothing like closing the stable door after the horse has bolted!
The next experience was at the recent Climate Camp (this post refers), which lasted for about a week and was exactly the sort of event at which I should have been kitted up.
But oh no, not me! I’d relied on the good offices of a mate who’d said he’d bring his laptop.
Problem was, there were three of us photojourno-types wanting to use it… and yep, you’ve guessed it… almost always at the same time.
And the other problem was internet access. The Camp’s internet link didn’t get up and running either as quickly or as smoothly as we’d anticipated. It ended up with my mate having to journey into the local shopping centre and buy one of those web’n’walk dongle thingies that enable mobile access to the internet.
Top marks and full credit to him for doing so, but it proved painfully slow for uploading pics etc, and of course it didn’t resolve the “three users bottleneck” issue.
It was the guilt that did it for me. Guilt at having relied upon someone else (notwithstanding their kindness and generosity) to provide a facility I should have been equipped with anyway.
Well, now I am! Laptop, plus one of those web’n’walk dongle thingies. Ye gods. I’m actually beginning to feel like a proper photojournalist. It’s quite worrying really.
Anyway. Before acquiring this nifty bit of kit I’d already pretty much spec’d it to myself…
It had to be lightweight. I’m already carting around more than enough weight as it is (what with my stupid two cameras and associated accessories) and really don’t need to stuff another weighty lump into the backpack.
It had to be fast. Loadsa RAM and a reasonably fast processor to cope with doing stuff with RAW files using Lightroom.
Had to be wifi-enabled (of course), and ideally running XP.
Hmm… p’raps I need to explain that last.
The obvious thing to go for (and much favoured amongst pro photographers) would be a Mac. Thing is, I have issues of long standing with Apple and thus on principle won’t get any of their gear. Stupid I know, but that’s the way it is.
And whilst I’d briefly considered Linux I had reservations about its handling the sort of work I’d want to throw at it (reinforced by a bit of research I’d done) plus Linux can sometimes be a bit of a pig when it comes to communications-type stuff (i.e., connecting to an internet access point). Which would rather defeat the whole object of the exercise… no point carting a laptop around if you can’t upload stuff etc straight away.
So really XP was the only viable option (didn’t fancy Vista after all I’d heard about it).
To proceed then… it had to have a reasonable size screen. Didn’t need something huge like I have on my “PC replacement” laptop, but neither did I want a dinky 12″ (or less) screen along the lines of what I’d seen on a rather super little notepad. My concern was that such a small screen size could make photo editing/processing a tad onerous (especially with my eyesight!).
Didn’t need a huge hard drive… I favour the external USB-powered portable hard drives for data storage etc… and I’ve already got a few of those knocking around somewhere.
And a useful battery life.
So, that was the spec. And I… um… nearly got close to matching it.
What I’ve actually ended up with is a really cool-looking bit of kit (it’s shiny black with a sort of cherry-red handrest and gorgeous blue lights!) that’s… er… bent the spec a bit.
Well, the screen size is ok. And the hard drive isn’t too big (160GB). And it’s not that much heavier than what I’d planned for (just over 2kg). And it is wifi-enabled (“They all are nowadays,” did I hear someone say?).
But hey, it’s got a huge amount of RAM (4GB) and a reasonably fast (two point something or other GHz) dual-core processor.
Works a treat. It’s great. Mind you, I haven’t yet tried carrying it around with me (along with all the other kit). Still, I’ll deal with that when I come to it.
Oh… I nearly forgot to mention… it’s… um… running… er… [hush!] Vista.
In fairness I have to say that so far my experience with it has been pretty good. I’ve already tinkered around with it a fair bit (as geek types tend to do) and I’ve not yet encountered any of the slowness or “hangs” that rumour had me believe would be inevitable. And, despite what I’ve heard others say, I don’t find the interface impossibly different to that of XP (so no huge learning curve). In fact, on present showing I’m having difficulty restraining myself from remarking that this is the best MS Windows OS to date (bar Win2K and NT, which still outstrip everything else of course).
And (oh joy!) I’ve finally got my head around using the touchpad instead of plugging in a mouse. Now I’ve configured it how I want it… which essentially means switching off that damn irritating tap-to-click feature. Great!
I’ve now discovered that all that stuff about colour calibrated monitors, “correct” colour balance and suchlike (you know, all that stuff that’s caused me so many headaches in the past) is just a load of nonsense. Utter rubbish. Sheer twaddle.
New laptop grasped in my grubby little mitts, what’s the first thing I do? Plug it in, flip up the lid, and switch it on of course. Get it fired up and start playing.
Tinker with the interface a bit. Yeah, I can handle this. Cancel that bloody tap feature on the touchpad. Good. Find Explorer and set the view to how I want it (never did like Explorer but it’ll have to do ’til I can afford to get the Vista version of my fave file organiser, “PowerDesk”… and for some bizarre reason money now seems to be really tight. Can’t imagine why!).
Rummage around in the registry a while (“Hmm, not having that starting every time I boot up” sort of thing). Connect to my router (now that was fun and games, but I sussed it eventually… all to do with tweaking some settings on the firewall I’d just installed). Update the AV proggie I’d also just installed. Run Windows update. Install a few other things. (Btw, rummaging around in the registry is not recommended unless you’re confident in what you’re doing!) Install a few more proggies. Begin to get bored.
Drum fingers on desktop. A few slurps of coffee. Bit more drumming. Randomly click a few keys and stroke the touchpad (hey, I could get quite used to that!). Definitely getting bored now.
Aha… check out Flickr. Call up the newly-installed Firefox (just like an old friend) and go to my Flickr page.
All the photos are crap! They’re ‘orrible! But after a little bit of investigating I find that so, too, are everyone else’s. Well, that’s some comfort at least.
So I spend the next 45 minutes or so twiddling and tweaking the colour adjustments for the screen. It’s still not quite right, but at least it’s a bit more acceptable now than it was straight out of the box.
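For anyone wondering what all that twiddling actually does under the hood: adjustments like gamma are essentially a per-channel power curve applied to every pixel value before it hits the screen. A minimal sketch (the 2.2 and 1.8 figures are just common defaults used for illustration, not what any particular laptop ships with):

```python
def apply_gamma(value, gamma=2.2):
    """Map a display value in [0, 1] through a power-law gamma curve.

    Lowering gamma brightens the midtones; raising it darkens them.
    Pure black (0) and pure white (1) are unaffected, which is why a
    gamma tweak changes the "look" without clipping the range.
    """
    return value ** (1.0 / gamma)

# The same mid-grey pixel under two different gamma settings:
mid_grey = 0.5
print(apply_gamma(mid_grey, 2.2))
print(apply_gamma(mid_grey, 1.8))
```

Brightness and contrast are similar but simpler: an offset and a scale rather than a curve. Between the three of them you can shift how every image on the machine looks without touching a single image file.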
And, so we’re all very clear on this, by “acceptable” I mean not too dissimilar to the displays on my other machines… allowing for unavoidable hardware differences.
And note that this task did not feature as the first (or even second, or third, or…) thing I did when I opened the lid! Ponder on that.
The whole point being, how many other people would be as finicky as myself in trying to make such colour adjustments? Change the brightness a bit maybe, possibly the contrast, and maybe even the gamma, but that’d be about it. At least, that’s it for most of the people I’ve observed sitting before a new machine.
And what of all those people who never even think of adjusting the screen colour, brightness etc, assuming how it looks “out of the box” is how it’s meant to look? (Which is pretty much what I did actually!)
Another point being, I’ve now got… um… four laptops (forget the PCs, and the machines I use elsewhere) all of which render colours slightly differently or with, if you like, a different colour bias.
So which one’s right?
In the good old days when I used to be involved with the printing industry and graphics pretty much everyone worked to a set colour standard (the generally used one being the Pantone system). You could quote a Pantone number (which referred to a specific and unique colour or shade) and pretty much guarantee that, regardless of the processes involved, the printing machine used, the brand of ink used etc, that colour could be reproduced identically from one print run to another, from one printing house to another. (Assuming the colour of the substrate was the same of course.)
It could be consistently reproduced by the use of what were called “spot” colours, or it could be used as a standard by which the product of colour process printing could be accurately measured.
And of course all graphics-type people always had on hand a Pantone colour book (“book” is a bit of a misleading term really. It was actually a thick wodge of colour swatches that had to be replaced fairly frequently cos of colour fade) as part of their standard equipment.
And I’m guessing that sort of standard and consistency still applies if one’s photos are to be actually printed in hardcopy form.
But what of web usage? Far as I can see if the web is the intended final form of presentation for one’s pics then all such standards fly completely out the window.
This is a problem I spent ages tussling with not so very long ago, with the output of the GX20. And I’m reminded of a phrase I read when, in the course of tussling with that problem, I started reading a helluva lot about RAW files and how they’re processed…
“Generally, there is no one single “correct” interpretation of a given raw format. Vendors make a relatively subjective determination of what the best “look” is, and then adjust their converter to produce that result.”
Whilst that refers to the interpretation of a RAW format, I’m beginning to suspect it can equally well be applied to the notion of “correct colour”.
You can photograph a particular scene in, say, the brightness of the midday sun. Then exactly the same scene in the early morning. And in the evening. On an overcast day. On a rainy day. On a day when the sky’s laden with snow. And so on. Even just hours apart on the same day. And each one of those photographs will show (very often quite markedly) different colours. We all know this, don’t we?
So which one’s “right”?
Remembering that I’m talking exclusively about web usage here, we can then add to this mix the fact that we can’t know what screen a given person’s going to see our pics with, or how carefully they’ve adjusted the colours, brightness, contrast etc… if at all!
And two supposedly “correctly” adjusted screens from different manufacturers may well render colours significantly differently.
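There is at least a standard way to put a number on “significantly differently”: colour science expresses colours as points in the Lab colour space and calls the distance between two of them “Delta E” (as a rough rule of thumb, anything much above 2 is noticeable side by side). A minimal sketch of the simplest (CIE76) version, with two invented Lab readings standing in for the “same” red as measured off two different screens:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: plain Euclidean distance in Lab space.

    Rule of thumb: around 1 is barely perceptible; above roughly
    2-3 the difference is plainly visible when compared directly.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical measurements of one red patch on two calibrated screens:
screen_a = (53.2, 80.1, 67.2)
screen_b = (51.8, 76.4, 64.9)

print(round(delta_e_76(screen_a, screen_b), 2))
```

So the mismatch can be measured, given a hardware colorimeter; whether anyone viewing your photos on the web has ever done so is another matter entirely.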
So what, if anything, can we do to introduce some degree of consistency into this? Some way of ensuring that the way I see a particular image in terms of its colours, brightness etc will also be seen by someone else, using a different screen.
Perhaps that’s an impossible requirement. I don’t know. I wish I did. And no doubt I will periodically revisit this dilemma (as I’ve already done) many times in the future.
But until I learn of a way of satisfactorily resolving it, all talk of colour accuracy and worrying about consistent colour reproduction on the web is, far as I’m concerned, a load of nonsense!