Archive for the 'Information technology' category

Mining bitcoin like it’s 2016

Let’s start this piece by making it absolutely clear that I don’t know what I’m talking about. I’m not a cryptographer, a banker, or a computer hardware expert. A couple of things have recently piqued my interest in Bitcoin: the financial impact following the UK referendum on EU membership, and a Product Tank talk I attended yesterday evening centred on “FinTech” (financial technology).

Why I care about Bitcoin

My highlight of that Product Tank talk was given by Lars Krüger, Head of Product at Blockchain. Blockchain’s tagline is “Be your own bank”, and they provide a handy smartphone app which allows you to manage your Bitcoin wallet. A bank account where the currency is Bitcoin. Bitcoin is like other currencies except not run by a central bank… and here is where every article on the subject just falls apart because there’s too much to explain all at once. You could read the Wikipedia article on it, but that’s fairly terse. Or you could just try and catch up by osmosis, which I imagine is how most people cope.

There is certainly a discussion to be had around “why currency is simply trust”. There’s a lot of talk about Quantitative Easing, negative interest rates and basic incomes. I’m finding all this very fascinating and there’s much to discuss surrounding all that, but let’s try and steer this back towards Bitcoin mining.

Following the UK referendum on EU membership, the British pound is now worth a lot less against the US Dollar than it was. If you live completely inside either the Pound or Dollar ecosystem that doesn’t mean much (ignoring external market influences beyond the scope of this article), but outside of either of those systems it means a lot. If I had bought £10,000 worth of USD on the 22nd June and sold those dollars on the 27th June, I’d have made over £1,250.
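The arithmetic behind that claim is simple currency round-tripping. Here’s a minimal sketch; the rates are my own rough approximations of GBP/USD around those dates, not exact market data:

```python
def round_trip_profit(gbp: float, rate_buy: float, rate_sell: float) -> float:
    """Buy USD at one GBP/USD rate, sell back at a lower rate.

    A falling pound means each dollar buys back more pounds."""
    usd = gbp * rate_buy          # dollars bought on day one
    return usd / rate_sell - gbp  # pounds recovered, minus the original stake

# Illustrative rates: roughly 1.49 before the referendum result, 1.32 after.
profit = round_trip_profit(10_000, 1.49, 1.32)
print(f"£{profit:,.0f}")  # roughly £1,288 with these example rates
```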

In order to spread risk then, it makes sense to hold funds in a variety of currencies. While I could buy US Dollars or Indian Rupees, Bitcoin holds a lot more interest, and one reason is that I had heard it is possible to mine them. That’s the concept of putting your computer to work and being rewarded in Bitcoin. Not having to buy them, but instead putting assets to work behind the scenes to build a residual income. Sounds fantastic.

So again I stress I’m no economist, but I can grasp that the value of the currency we exchange today is no longer based on anything other than trust, and it’s that trust and confidence that drives comparative currency values. I’m happy that it is okay to allow new money to be created so long as it is done at such a rate as to not undermine confidence in the value of a currency.

Finally I get to the point of this section. To mine Bitcoins you need electricity and beefy computers. I have solar panels and some handy computer gear lying around. How hard can this be?

Making it difficult

If we could all print sheets of £20 notes at a material cost of a few pence, cut them up with scissors and have them accepted as legal tender, the economy would fall apart. There’d be no trust. £20 today would be worth less tomorrow. So it can’t be easy to mine a Bitcoin. If I was awarded a Bitcoin for simply clicking a button in a computer application, Bitcoins would be worthless.

But Bitcoins are far from worthless. Today (28th July 2016) a Bitcoin is worth £501.60. It needs to be difficult to earn even a fraction of a Bitcoin. And it is. Very hard. Pointlessly hard.

Briefly, in order to mine a Bitcoin a computer must perform millions of millions of mathematical computations. That takes time and electricity, and the hardware costs money. In order for Bitcoin mining to be worthwhile, the Bitcoin you earn needs to be worth more than your hardware and electricity expenditure.

Again: with a solar energy surplus, some good computers, and a single Bitcoin being worth £200 more now than it was at the turn of the calendar year, I thought I was in a strong place to start. No. Not at all.

Measuring your mining speed

Mining a single Bitcoin all by yourself is basically impossible. I was told that yesterday, and 24 hours later I believe it. The alternative is joining a shared pool of miners, who all work together and divvy out the rewards in proportion to the effort put in. Once I’ve described the economics of pooled mining, it should be clear why solo mining is pointless.

My main desktop PC has a quad core i7 with hyper-threading (exposing 8 virtual cores) running at 3.7GHz. It’s a few years old but it certainly does the job. I never have to wait for the CPU. It turns out that if I dedicate all 8 of those cores to Bitcoin mining, I can achieve a speed of about 2 million hashes a second. A hash is a mathematical calculation. Look up the details if you care and want to know more.
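To make “a hash” slightly more concrete: Bitcoin applies SHA-256 twice to a block header, and a miner hunts for a nonce that makes the result fall below a difficulty target. Here’s a deliberately simplified toy sketch of that idea (real difficulty is a numeric threshold, not a string prefix, and real headers are binary structures):

```python
import hashlib

def toy_mine(header: bytes, prefix: str = "000") -> int:
    """Find a nonce whose double-SHA256 digest starts with the given hex
    prefix -- a crude stand-in for Bitcoin's difficulty target."""
    nonce = 0
    while True:
        candidate = header + str(nonce).encode()
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

nonce = toy_mine(b"block header")
# Each extra hex digit in the prefix makes the hunt ~16x longer on average,
# which is exactly why mining speed is measured in hashes per second.
```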

Anyway, 2 million per second sounds like a lot. I had heard that actually a graphics card is rather better at this hashing business than a CPU. It sounded strange, but it’s because GPUs are built for massively parallel number-crunching (all that polygon rendering in 3D video). Anyway it’s true, and as luck would have it I love to have loads of monitors hanging off my PC, so I’ve got an AMD Radeon R7 200 something or other and it can do about 175 million hashes a second. That’s right! It’s getting on for 90 times faster than my CPU!

And here’s the best bit. For reasons that are rather dull I’ve got two machines of similar specification, and when it is sunny, easily enough solar energy to run both of them. So I should be able to purr along at 350 million hashes a second for free. This was going to be easy money.


Typically a mining pool will pay out when you’ve earned 0.1 Bitcoin. Remember that’s about £50. So I’d need to earn around £50 before I’d see any return. At 350 Mh/s (megahashes, that is millions of hashes, per second), do you know how long that would take?

Really though, what’s your guess? Running both machines might draw an absolute maximum of a kilowatt, and a kWh would cost you 15p if you didn’t have the benefit of solar panels. It would therefore take 333 hours (nearly half a month) before the power cost hit £50, so clearly one needs to earn 0.1 Bitcoin in less time than that.

What was your guess then? The answer is that, running constantly at 350 Mh/s, it would take over 1,500 years to earn 0.1 Bitcoin. Now I’m renowned for a lack of patience, but I suspect most people wouldn’t wait that long.
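Those two numbers fall straight out of a couple of lines of arithmetic. A sketch using the figures above (the 1 kW draw is a worst-case assumption for both machines, and the 1,500-year estimate is the one quoted for 0.1 BTC at 350 Mh/s):

```python
POWER_KW = 1.0        # assumed worst-case draw of both machines
PRICE_PER_KWH = 0.15  # typical UK tariff, in pounds
HOURS_PER_YEAR = 8760

# Hours of mining before electricity alone costs the £50 payout:
hours_until_50_pounds_of_power = 50 / (POWER_KW * PRICE_PER_KWH)
print(hours_until_50_pounds_of_power)  # ~333 hours, nearly half a month

# Hours of mining actually needed to earn 0.1 BTC at 350 Mh/s:
years_to_payout = 1500
hours_to_payout = years_to_payout * HOURS_PER_YEAR

# The mining run is tens of thousands of times longer than the window
# in which the power bill stays under the payout.
print(hours_to_payout / hours_until_50_pounds_of_power)
```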

How can it work, then?

I was surprised that my GPU (Graphics Processing Unit) was 90 times better than my CPU at this business. It turns out it is possible to buy apparatus designed for Bitcoin mining. “Antminer” is a common choice – a brand with an evolutionary series of products designed purely with Bitcoin mining in mind.

I looked into this. Here’s a table comparing my two existing PCs and various Antminer products.

Rig           Power (watts)   Gh/s (giga-hashes/sec)   Power cost to earn £50   Hardware cost at today’s prices   Time to earn £50
2 PCs         900             0.35                     £1,773,900               £0                                1,500 years
Antminer S1   360             180                      £1,380                   £18                               3 years
Antminer S3   366             400                      £631                     £65                               1 year 4 months
Antminer S7   1,200           4,700                    £176                     £350                              41 days
Antminer S9   1,375           14,000 (!)               £68                      £2,400                            14 days

Maybe my figures are based on some less than accurate estimates. Still, unless I’ve got something very wrong here, this is bonkers. None of them make any sense based on a typical UK electricity tariff of 15 pence per kilowatt hour.
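Since the derived columns scale linearly with hash rate, the whole table can be reproduced from the two-PC baseline. A sketch under that assumption (power figures and hardware prices are the article’s July 2016 numbers):

```python
BASELINE_GHS = 0.35      # the two PCs
BASELINE_YEARS = 1500    # their estimated time to earn 0.1 BTC (~£50)
PRICE_PER_KWH = 0.15     # typical UK tariff, in pounds
HOURS_PER_YEAR = 8760

def time_and_power_cost(watts: float, ghs: float) -> tuple[float, float]:
    """Scale the baseline: time to payout falls linearly with hash rate,
    and the power bill is just kilowatts x hours x tariff."""
    years = BASELINE_YEARS * BASELINE_GHS / ghs
    cost = (watts / 1000) * years * HOURS_PER_YEAR * PRICE_PER_KWH
    return years, cost

rigs = [("2 PCs", 900, 0.35), ("Antminer S1", 360, 180),
        ("Antminer S3", 366, 400), ("Antminer S7", 1200, 4700),
        ("Antminer S9", 1375, 14000)]

for name, watts, ghs in rigs:
    years, cost = time_and_power_cost(watts, ghs)
    print(f"{name}: {years * 365:,.0f} days to earn £50, £{cost:,.0f} in power")
```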

Even if they could be entirely funded by solar energy and we ignore the capital outlay on that solar rig, the S7 would take 6.5 years to break even, and the S9 nearly 2 years.

In conclusion then, if you have infinite free power (you’ll probably need A/C too), you can steal the custom hardware, and you can afford to wait forever for a return, Bitcoin mining might be worthwhile. Otherwise it isn’t.

iPad ownership

When Apple’s iPad was first launched I was openly skeptical about it. What was the point of this device? It wasn’t as mobile as my phone, or as powerful as my trusty laptop. A satirical video spoof of a man out jogging with an iPad strapped to his arm, and of course all the gags about the name better suiting a sanitary towel, amused me greatly.

A colleague (and long established Apple fan) was quick to snap one up, so it wasn’t long before I got to see an iPad in the flesh. Then, still, I mocked the device, and (carefully!) used it as an iBeerMat.

However, last Christmas I was fortunate enough to be gifted an original 32GB wireless iPad. Given my previous iPhone experience it didn’t take me long to get everything set up. After an initial period of fascination with what is certainly a very interesting gadget, its real benefits started to show.

A year on, this device (yes, this post has been written on my iPad) is used far more than the workstation, laptop or my mobile while I’m in the house. It doesn’t do any single thing well enough to warrant its price tag, but it does so many things that I’d now swiftly replace it if it ever broke.

On that topic, the screen is (understandably) fragile. I managed to crack mine during a spectacularly malcoordinated mishap. Fortunately I sourced a replacement screen kit for around £40 (thanks, eBay!) and while the outer edges of the device show the signs of my rudimentary intrusion, it’s all working okay.

Use cases

The various catch up TV apps mean I can watch television no matter where I am in the house, from the bedroom to the bathroom and of course the kitchen. On the subject of the kitchen, the web browser there is incredibly useful for recipes, and while the iPad is mounted on the wall the touch screen means the talented among us can even navigate the web with our noses while our hands are covered in sticky ingredients.

An iPad is substantially easier to take on holiday than a laptop, and is just as good if like me your only needs are to stay on top of email and social media while you’re away. Oh, and for taking your own films to watch (try CineXPlayer) on the plane or in the hotel room.

I’ve used the notepad feature a great deal, while studying or taking measurements etc. When our daughter came along, the iPad became an essential part of the baby feeding kit during the early days when it was important to note down all her intake – we still know the details of every feed she had during her first month at home!

While watching TV I’m often distracted by questions such as “what other film have I seen this guy in?” or “surely that’s not factually correct!” – well, the IMDB and Wikipanion apps come straight to the rescue there. Sure, I could look this up on the phone or the laptop, but (and I can’t believe I’m saying this now) the laptop is too big and slow to get going, and the mobile does have a rather poky screen.

The battery life is good too – the iPad craps all over both the iPhone and my laptop in that department.

I use the reminders application a lot to keep on top of my domestic to-do list. If I think of something while I’m out and about I can add the task from my phone and it’ll be waiting for me on the iPad when I’ve got time to crack on with things.

It’s also my home device of choice for staying up to date with Twitter and Facebook, although the lack of synchronisation between Twitter apps gets quite tedious (ooh I’ve got a new mention; oh no, I saw that hours ago on my phone). Some web sites are better to read on their free iPad app than the original web version due to fewer adverts – Mashable is a good example here.

Finally (for now!), it’s a great portable radio which I use a lot for Radio 4 and cricket commentary.


Overall then, I can’t believe what a fan-boy I’ve become. The iPad is a great device. There, I said it.

PSN PlayStation Network Downtime

Update 26th April 2011 – 21:00 GMT

Sony have released more details on their blog – personal data has indeed been compromised. The important bit:

Although we are still investigating the details of this incident, we believe that an unauthorized person has obtained the following information that you provided: name, address (city, state/province, zip or postal code), country, email address, birthdate, PlayStation Network/Qriocity passwords and login, and handle/PSN online ID. It is also possible that your profile data, including purchase history and billing address (city, state, zip), and your PlayStation Network/Qriocity password security answers may have been obtained. If you have authorized a sub-account for your dependent, the same data with respect to your dependent may have been obtained. While there is no evidence that credit card data was taken at this time, we cannot rule out the possibility. If you have provided your credit card data through PlayStation Network or Qriocity, to be on the safe side we are advising you that your credit card number (excluding security code) and expiration date may also have been obtained.

So, if your PSN password is associated with your PSN email address anywhere else, time to change all other instances.

What is unavailable?

At the time of writing Sony’s PlayStation Network (PSN) has been unavailable for over 5 days. As an avid user of online gaming (mainly the Call of Duty series and Gran Turismo), this has now certainly reached the point where I’m getting a bit grumpy about it. Had we not had such good weather over the weekend, I’d probably be fuming! It’s still possible to play single player games and watch movies etc, but online gaming as well as internet based services such as LoveFilm aren’t available. The incredibly poor photo on the right shows what happens when I try to log in: “PlayStation®Network is currently undergoing maintenance”.

Why is it unavailable?

PSN is down because it was hacked last week. Well, Qriocity (pronounced ‘curiosity’) was hacked – that’s PSN’s media streaming service. Sony appears to have taken the view that it is best to take the entire service offline while they shore up their security.

Has any information been leaked?

This is a very interesting question, and one to which we haven’t yet got any answers. [See above update; we have now!] In order to sign in to PSN you need a valid email address. There are about 70 million registered accounts, so that’s quite a haul of valid email addresses. If those have got out, expect some spam (links to malware sites enticing you with Adobe products appear to be most in fashion at the moment). However, the real haul would be the credit card details many users have on file to permit them to easily make purchases from the PlayStation Store. Sony offer all sorts of media from game extensions to the latest films for hire, so I’ll bet a good chunk of those 70 million users have credit card numbers stored along with their addresses on the PSN. If those have got out, it’ll get rather more interesting for everyone.

Why was PSN hacked?

As yet there’s no official answer. It could be simply an attempt to harvest the email addresses and credit card numbers mentioned above. I think there’s a good chance this event can be traced right back to the launch of the PlayStation 3 Slim in September 2009. Prior to that point Sony had an extremely geeky USP in that the original PS3s offered OtherOS – or the ability to install any operating system (normally Linux) on the unit to sit alongside Sony’s. Just before the slim was launched Sony enforced a software update that removed this functionality – probably due to game piracy concerns.

To cut a long story short this suddenly made it a challenge among the hacker community to find ways around Sony’s attempted block. Then followed the usual ping-pong match of loopholes being found and patches being released, but the sparks soon turned to flame when Sony filed a lawsuit against renowned hacker George Hotz (Geohot) who had published his PS3 jailbreak technique on his website. Many argue that Sony deserved this for removing one of the console’s key features – essentially they have mis-sold the product. Sony I would imagine see this as an unfortunate side effect of protecting the reproduction rights of game creators. While Sony and Hotz have reached an out of court settlement, speculation is rife that Hotz’s supporters have led this breach on Sony’s network.

When will PSN service be resumed?

Right now the best sources of information would be the PlayStation blog, or their Twitter account @PlayStationEU. There’s also this entertaining site.

You should have bought an XBox!

I know some of you just can’t wait to cram the comments box with this sentiment, so I thought I’d save you the energy. ;)

Zend Certified Engineer

Yesterday I took (and passed) the Zend PHP 5.3 Certified Engineer exam – more information here from Zend and on Wikipedia. Having worked with PHP for about 11 years I had first imagined that this qualification wouldn’t be particularly stressful – that turned out not to be the case! My regular work patterns covered about 60% of the syllabus, but even within that 60% I relied on the PHP manual a lot more than I had realised.

The last (credible) exam I took was probably at university, and upon reflection I got myself considerably more wound up about the ZCE than anything in higher or further education. The last time I got this stressed about an examination was my driving test. The two events were similar in that I considered myself more than competent in the appropriate field, yet the exam stood between my assertion and independent confirmation.

I booked the exam 3 weeks in advance (as soon as we got back from Thailand) and spent the first two weeks gently revising. Then I sourced my first mock test, and the horrific reality of the situation became clear: not only did I need to learn a lot more about the aspects of PHP that I used every day, but I also had a whole load of new topics to cover. The less technically interested can leave now with the following summary: I revised like my life depended on it for a week, and yesterday I was victorious, which was a massive relief. Those who want to know more about the PHP 5.3 Zend Certified Engineer exam, read on…

Note that when I took the exam I signed a disclaimer promising not to reveal any of the questions, so please don’t ask me for them. I can best sum up my horror with the following mock question I dug up from the internet, which asked what the result of this code would be:

echo strlen(sha1('0', true));

Now I’ve since asked a couple of people much cleverer than me about this, and I was pleased to note that neither of them knew what the second parameter to sha1() was (even though it’s the same as the second parameter to md5(), ha!). Still, unlike me, they both guessed that it meant it’d yield the raw binary output, and as such proved they’re naturally in a position closer to passing than I was a week ago. There were more though, like what does the second parameter of count() do, and what does this output:

echo "hello123" . 34 + 4 , 123 . 11;
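For the curious, the first question’s trick can be sanity-checked outside PHP: a raw SHA-1 digest is always 20 bytes, so strlen(sha1('0', true)) returns 20. Here’s the same check in Python (my own illustration, not an exam question):

```python
import hashlib

# sha1($data, true) in PHP returns the raw 20-byte binary digest rather
# than the usual 40-character hex string; Python's .digest() is the
# equivalent of that raw mode, and .hexdigest() of the default.
raw = hashlib.sha1(b"0").digest()
hex_form = hashlib.sha1(b"0").hexdigest()

print(len(raw))       # 20, matching strlen(sha1('0', true))
print(len(hex_form))  # 40, matching strlen(sha1('0'))
```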

Then there’s the next issue: this was a PHP 5.3 test. I started working with PHP 3, and today work with the current enterprise release on CentOS which is 5.1.6. I chose to do the 5.3 exam partially because it was the only one available, but I was also convinced of its merit because RHEL 6 has just been released with PHP 5.3(.2), so this qualification should stand up for a few years yet. The issue of course is that being a creature of habit I hadn’t used all of PHP5’s functionality, and I hadn’t even been exposed to the 5.3 software at work.

Fortunately my home server now runs Fedora 14 and so had PHP 5.3.6, which made an excellent revision platform. Still, we’ve got some (IMO very good) custom frameworks at work, which meant I hadn’t had direct exposure to PHP’s PDO, mysqli, SimpleXMLElement, DOMDocument and streams functionality. I managed to hoover all this up in a week, and in doing so implemented some nifty custom scripts, including a revised RSS-to-HTML reader module to provide work blog content on various web pages.

There were some moments during the test when I doubted its quality; for instance I needed to recall in free-form the allow_url_fopen ini variable, but at the time couldn’t remember if it was in fact called allow_url_open. In the real world under such circumstances I’d just open the /etc/php.ini file and search for allow_url_ – this would hardly cost me any time – yet the exam would only reward me for the exact answer.

Ultimately passing the test and obtaining Zend Certified Engineer status has been extremely satisfying. While I don’t think it’s a perfect measure – the very fact that I can pass (and there’s no higher merit pass) means that those who clearly know more than me can’t achieve further recognition – it does set a useful standard. In my role as an employer I would certainly take any prospective PHP engineers with ZCE status seriously.

PS3 – replacing HDD (hard drive)

I’ve got a 40GB chunky PS3, and I’ve recently bought GT5 (Gran Turismo 5). It’s a good game, but it spends a lot of time ‘thinking’, especially when loading circuits. To improve matters it has an option to store a lot of this information on the PS3’s HDD, but it apparently needs around 15GB, and I just didn’t have that much space to spare, so I’ve replaced the drive. There are 3 distinct steps to this:

  1. Backing up the data (wouldn’t want to lose valuable game progress!)
  2. Physically swapping the drives
  3. Restoring the data

The first issue I faced was getting the PS3 to recognise an external drive. I located the backup utility under the Settings menu, but it wouldn’t see my 4GB USB stick. A bit of reading revealed that it had to be formatted with the FAT32 file system. I popped it in my PC, did a quick format and plugged it back into the PS3, but that didn’t help. So I did a long format, returned it to the PS3 and made sure I could see it under the Video menu – I could. I then returned to the backup facility only to be told that it wasn’t big enough. This much was of course obvious to me; I’ve got a lot more than 4GB of data to back up, but I was hoping to be able to transfer it in chunks. This doesn’t appear to be possible.

I therefore grabbed an external 250GB USB drive. This was already formatted with the ext2 filesystem for my Fedora box, so when I plugged it into my PC it was ignored and I had to use Windows 7’s disk manager utility to sort it. If you need to do this, right-click My Computer, choose Manage and you should find it in the tree under Storage.

Using this tool I created a 32GB (32768MB) partition on the drive – this appears to be the largest size supported by this tool on Windows. If you need to backup more than 32GB of data, try this tool (without warranty!). Anyway, once I’d created the partition the drive appeared in My Computer and I was able to use Quick Format to pop a FAT32 file system on there. I then connected the drive to the PS3 and it was recognised, and we were in business!

So as you can see, it’s not a swift process. It may be worth noting that there’s a PS3 to PS3 backup utility which works over a direct ethernet link. However in an attempt to honour copyright protection this process wipes the HDD once it has finished, so I chose not to risk the data on my PS3 slim. Frankly given how long it takes they should supply another PS3 to keep you entertained while you wait! Plenty of time to blog though! Eventually it finished, and just to be sure I attached the drive to my PC and saved a copy on my server.

Then it was time to change the drive. Tools required: a small flat-head and a small cross-head. Use the flat-head to pop off the plastic cover:

Then undo the centre screw (blue on my machine).

This allows the HDD casing to be slid right and extracted.

Here’s a comparison of the outgoing 40GB 5,400rpm Hitachi on the left, and the new 500GB 7,200rpm Seagate on the right. I hope the extra spin speed translates to more performance!

Refitting is as ever the reversal of removal. Once done, fire up the PS3 and it’ll want to format the new drive:

That doesn’t take long.

Then connect the external drive, go back into Settings, find the backup utility and restore!

The restore process was quicker than the backup, but I’m not sure why. The new drive is of course faster, but writing is normally slower than reading. Overall then, over 10 times the space (and perhaps a quicker drive) for £47 (ebuyer).

Result – look at all that space! Now to let GT5 have its way…

Facebook Places – privacy concerns

There has been a fair bit of “concern” in the media now that Facebook have launched their new Places feature in the UK. Some of these concerns relate to privacy, some escalate that to safety. “Facebook has gone too far” they cry, “now everyone knows where you are”. Hmm.

I would suggest that Places hasn’t really changed anything at all. The risks associated with notifying internet users of your location haven’t worsened. As social media becomes more and more mobile and media rich it’s not at all uncommon for a person’s location to be deduced from the content they produce.

Facebook wall updates along the lines of “Enjoying a meal at Prezzo with the girlfriend” have been commonplace for years. Yep, you’re out, and your house is empty.

Twitter is significantly more concerning in my view. A geotagged Twitpic around the home? That’s where you live. Tweeting that you’re about to enjoy a week-long holiday abroad? Bingo.

So here’s the thing. Facebook’s Places hasn’t made anything any worse. Facebook’s privacy controls ensure (by default) that only your Facebook friends will see your Places updates. If you’ve got any sense you’ll keep it that way, and you’ll do the same for all the other content you push to that site, including photos and wall posts.

It’s all too easy to complain. Facebook helps me stay in touch with people who have moved away. Facebook’s Places will help me feel more a part of their lives, and its ‘nearby’ feature might just help me catch up face-to-face more often with those more local friends. There are of course risks with publishing your movements on the internet, but we’ve got controls to minimise that risk should we choose to use it.

Rage Against The Machine make it to Christmas #1

It has been an extremely interesting week for not only the British music industry, but also for the internet and social media. The battle for the UK Christmas number one single has been fought on many topics. I won’t hide my point of view here: I thought the X-Factor’s offering was painful at best, and I am delighted to have such an entertaining song at number one from RATM, but I would like to focus on how it happened.

Tracy and Jon Morter, a couple that I’d never heard of before, started a Facebook group originally titled “Rage against the x-factor”. It strived to get members to buy RATM’s “Killing In The Name”, in the hope that it might outsell the X-Factor’s single. Given that the power of traditional broadcast media and interruption marketing has meant that every X-Factor winner from 2005-2008 had been number one, this seemed like quite a task. Yet, although it was close, that campaign succeeded.

As I write, that group has just shy of 1,000,000 members. The initial growth was completely viral. Tracy and Jon invited their friends, who invited their friends, et cetera. This wave swept through my Facebook account. I can’t remember which of my friends appeared in my news feed as having joined the group, but it was enough to convince me, which means my other friends may have also seen it in their news feed, and so on.

Twitter became vital, attracting the attention of the likes of Stephen Fry and Bill Bailey. With retweets from celebrities came a massive following, with #ratm4xmas trending, and eventually crunch point: media attention.

Suddenly Jon Morter was being interviewed live by Jo Whiley on Radio One. Rage Against The Machine were interviewed and played live on Radio 5 – predictably ignoring their promise to keep the version clean – which of course attracted yet more attention.

Zach de la Rocha, Rage Against The Machine’s lead singer, was interviewed when their number one success was announced, and said something quite inoffensive that raised my eyebrows. He referred to the “UK kids” having “spoken”.

Here, I think he’s got it all wrong. Facebook’s insights could confirm this. If we consider kids to be those under 18, I wonder how many of them changed their usual music-buying patterns as a result of this campaign. Aged 30, I am confident that I and my peers changed ours a lot. Prior to this week, I hadn’t bought a music single in over a decade. Why would I? Albums, yes. Yet for everything else there’s the likes of Spotify and Last.fm.

Social Media has torn traditional marketing apart. It has reached all ages. It got people’s attention when it suited them – when they checked Facebook or their Twitter feed. Traditional marketing relies on people observing bus shelter adverts, wanting to listen to radio DJs, wanting to buy a magazine or newspaper to see the adverts. In this case, it also relied, heaven forbid, on people wanting to spend their Saturday evening watching the X-Factor.

Yet Social Media marketing arrives as a message from your friends, when you decide you want to see it. Your friends have already done the research, they’ve shown an interest, and maybe you would like to as well. Simon Cowell and the X-Factor will be back – presumably at number one next week. This campaign has of course been a flash in the pan, but for Social Media, it is perhaps a coming of age. It’s a marketing tool that can be ignored no longer.

Competitive England soccer match on internet only

England’s soccer fan-base is still reeling from the news that the next competitive match, this Saturday, will not be available to view on television. Due to the collapse of Setanta, the rights for the match against Ukraine have been snapped up by a firm called Perform, who will be streaming their live coverage to a million viewers on the internet.

The question is, has this really been thought through? I’m a big fan of internet technologies, and I’m absolutely subscribed to the idea that computers will provide the gateway to future TV-style entertainment. BBC’s iPlayer concept is fantastic. I’m aware that Channel 4 got there first, but the BBC now have a significantly more advanced product, and their commitment to formats such as the PS3 has got me hooked. I watch much of my TV just like this – PS3 connected to TV:


Catching up on Question Time is however a completely different kettle of fish to watching an England match live. I have visited the site and checked the ‘HD’ stream. I have a number of issues with the concept:

1) Sport is especially good in HD, and is certainly best on a big screen. This ‘HD’ test stream was about 50% of the size of my 720p display. That’s not HD. 1080p is HD. This is significantly worse than standard telly.

2) Internet video streaming is still a bit ropey. The PS3 is hard-wired to my good 20Mb connection, but I don’t trust it with something like live competitive sport. If Newsnight fails to stream, it’s no biggie. If I’ve got a load of mates round to watch the footie and the feed fails, it matters.

3) I’m a long way from being convinced by England’s broadband capacity. I figure that on Virgin Media’s fibre I stand a pretty good chance compared to those on traditional copper fed DSL, but in both cases, how can we be sure that when it comes to the crunch, the transfer capacity will be there? Gloomy autumnal Saturday afternoons are peak internet traffic zones – add the significant weight of 1,000,000 users, many of whom wouldn’t normally load the internet much at all, loading up on video streams, and I think we’ll hit our biggest contention problem to date.

4) This video stream isn’t technically permitted in pubs. Pubs aren’t well known for internet savvy landlords and big internet connections, so even if it were permitted it would present issues. I know that some pubs will acquire potentially dubious foreign satellite feeds, and while this may not be entirely legal, it makes a lot of sense. For all the reasons listed above, a landlord needs to do whatever it takes to keep punters happy.

5) There will certainly be a lack of community spirit around these matches. The very fact that pubs shouldn’t be showing it means social football viewing will be decimated, and equally, not many homes can put internet video on a big screen. Are fans supposed to crowd around tiny computer screens to soak up the atmosphere? Worse still, if you’ve got a big computer monitor like my 24″ Dell, at 1920×1200 the supposedly ‘HD’ stream looks nothing short of revolting.
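Points 1 and 3 can be put into rough numbers. Here’s a back-of-envelope Python sketch; the 3 Mbit/s per-viewer bitrate is purely my assumption, as is reading “50% of the size” as half the pixel count:

```python
# Pixel counts: what the 'HD' test stream offers versus real HD.
hd_720p = 1280 * 720           # my display's native resolution
test_stream = hd_720p // 2     # the stream looked about half that size
full_hd = 1920 * 1080          # what 'HD' ought to mean

# Aggregate demand if a million of us tune in at once.
viewers = 1_000_000
per_stream_mbit = 3            # assumed bitrate per viewer
total_gbit = viewers * per_stream_mbit / 1000

print(f"test stream ~{test_stream:,} px; true 1080p is {full_hd:,} px")
print(f"aggregate demand ~{total_gbit:,.0f} Gbit/s")
```

On those guesses that’s around three terabits per second of concurrent demand, which is a lot to ask of the national infrastructure whatever the contention ratios.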

All this said, I’ll go into this with an open mind. I’m going to get some friends round and hope that all the technology works. I’ll need to log in to the site on my PS3 with the details I purchased earlier in the week. The provider’s server will need to support 1,000,000 streams. My internet connection will need to hold up for two 45-minute periods. The quality of the stream will need to be good enough to render the sport properly when upscaled to a 40″ 720p screen.

This isn’t a big ask. I can watch some Premier League matches on ESPN HD in real 1080p at no extra cost, where the image is pin sharp. I predict a bit of a fail here, but I’ll let you know. I am certain about one thing though: I’m glad this match isn’t crucial, and that this effective trial will be out of the way before the World Cup Finals. I believe this is the future, but we’re simply not technically or socially ready for it yet.

My TV is poorly

We’ve got a 40″ LCD TV made by Humax. We paid about £500 for it in early 2007, and up until recently I’ve been really very pleased with it. My mind has been changed because the remote control was becoming less and less responsive, and now it does nothing at all.

TV on

I had dismissed issues such as flat batteries and dirty lenses on the transmitter and receiver, and so turned to the internet for help. Here’s an interesting way of telling whether your remote control is transmitting anything at all: digital camera sensors can see the near-infrared light that our eyes can’t. This is a photo taken with my SLR of my point-and-shoot camera pointing at the end of the remote control.

Button not pressed:

remote inactive

And now when the button is pressed:

remote active

It’s faint, but you should be able to see a dim blue light in there. The LED pulses rapidly as it transmits its codes, so it’s very hard to capture in a photo, but when looking at the point-and-shoot’s screen with the naked eye the flicker is very bright indeed.

Anyway, this proved that the remote was at least firing. It is possible that it is broken in some way, but I can’t check for that so I’m going to proceed on the basis that the remote is fine. So I turned my attention to the TV.

Getting the TV apart was quite easy. I was generally impressed by the engineering and the way in which it is screwed together.


Eventually I got to the IR board.

IR board

I fired up the TV and tried pointing the remote directly at the receiver in the hope that the TV case was dirty, but no luck. I took the multimeter to the pins on the IR receiver and found 5V, earth, and a third pin that wobbled around 3.5V regardless of remote control interaction – I guess I need an oscilloscope to get any further there. I re-flowed the solder on the IR receiver’s three pins to be sure there were no dry joints, but still no luck.
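For what it’s worth, that steady ~3.5V reading doesn’t rule the receiver in or out: a multimeter reports the time-average of a fast signal, so even a healthy data pin pulsing between 5V and 0V would read somewhere in between. A toy Python illustration – the 30% low duty cycle is an arbitrary assumption:

```python
def average_voltage(duty_low, v_high=5.0, v_low=0.0, samples=1000):
    """Mean voltage of a square wave that spends `duty_low` of its time low.

    This is roughly what a DC multimeter reports when probing a signal
    that toggles far faster than the meter can follow.
    """
    n_low = int(samples * duty_low)
    wave = [v_low] * n_low + [v_high] * (samples - n_low)
    return sum(wave) / len(wave)

print(average_voltage(0.0))  # idle pin, no bursts: reads 5.0 V
print(average_voltage(0.3))  # pin low 30% of the time: reads 3.5 V
```

So an oscilloscope (or even a cheap logic probe) really is the only way to see whether the receiver is decoding anything.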

So at this point I’ve put the TV back together and have sent an email to Humax’s support department to see if I can source a replacement IR board. What’s interesting is that the TV’s buttons are connected to the left of that board, and the cable on the right leads into the TV proper. So as the TV’s buttons work (it’d be useless now without those!) I’d have thought the problem is localised to the IR receiver or the few components around it.

Should be pence to fix really, but without more knowledge and assistance it looks like we will have to buy another TV. In this age of vast consumerism that seems quite a shame.

Anyway, for now the TV at least works. I can certainly use the exercise generated by the lack of remote control. ;)

Fedora 10 – my GUI broke!

Apologies to the petrolheads out there for whom this will be gobbledegook, but I’ll try to make this comprehensible as I hope it will help others travelling the ‘net. My home server appeared to break itself while I was away this weekend. That may seem like an outrageous claim, but really – it was fine, I switched it off, I went away, I switched it on again, and it was broken.

Being a scientific sort, I’d better define broken: it booted, but the graphical user interface (GUI) wouldn’t start. It turns out that last week I applied a yum update (a bit like Windows Update), which updated the X server (the software that draws the graphical display). The update wasn’t properly tested against hardware configurations like mine, and it broke everything. :(

A bit of internet research soon showed that other people had suffered the same problem, and today a kind soul has posted i386 fix instructions. I’ve adapted these for x86_64 users (like me):

Honk honk! 
