I had to chuckle at this article, wherein IBM seems vexed that the number of computer science and IT graduates in the USA is declining. Really. IBM is one of the very IT companies that led the charge to offshore jobs and slash US IT positions.
And they wonder why IT is not as attractive an option for college students? They have already sent the message that 'cheap' is what they want, not homegrown (or even good, for that matter).
In reading Bruce Schneier's post When the Internet Is My Hard Drive, Should I Trust Third Parties? I was struck with the thought that things are not necessarily as bad as he makes them out to be. Using sites like ma.gnolia, diigo, and even Google Notebook, it is possible to save not only a link but also the actual content behind it. So if the original source goes away, you still have the content.
Of course, you are then trusting that ma.gnolia itself is not going to go away and take all of your archived content with it. But if you look beyond web-based tools to something like Soho Notes, you can clip, save, and back up web content to your local drive all you want and reduce your chances of becoming a victim of link rot. A synchronization mechanism between the online and local (or mobile, for that matter) copies of the data would improve its survivability further.
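The core idea, saving the content rather than just the link, is simple enough to sketch. Here is a minimal, hypothetical Python version; the function names, directory layout, and naming scheme are my own illustration, not any particular tool's API:

```python
# Sketch of "save the content, not just the link" as a hedge against
# link rot: fetch a page and keep a local, timestamped snapshot of it.
import hashlib
import time
from pathlib import Path
from urllib.request import urlopen


def save_snapshot(url: str, data: bytes, dest_dir: str = "web-archive") -> Path:
    """Save raw page bytes under a stable name derived from the URL."""
    # Hash of the URL plus the date, so re-archiving the same page on a
    # later day produces a new snapshot instead of overwriting the old one.
    digest = hashlib.sha1(url.encode("utf-8")).hexdigest()[:12]
    stamp = time.strftime("%Y%m%d")
    out = Path(dest_dir)
    out.mkdir(parents=True, exist_ok=True)
    path = out / f"{digest}-{stamp}.html"
    path.write_bytes(data)
    return path


def archive_url(url: str, dest_dir: str = "web-archive") -> Path:
    """Fetch a page and archive its content locally."""
    data = urlopen(url).read()
    return save_snapshot(url, data, dest_dir)
```

A real tool would also want to grab images and stylesheets, and a sync layer could simply mirror the archive directory between local and online storage, but even this crude version means a dead link no longer takes the content with it.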
I would still like to see this problem 'solved' by implementing what I call Ubiquitous Data.
Depending on how you do the math, either yesterday or March 31st is the 10th Anniversary of Mozilla. In a way, it doesn't seem that long ago. And in another sense it hasn't really been ten years, because the real game changer didn't arrive until six years later, when the first release of Firefox appeared in 2004.
Since then, Firefox has delivered a little over three years of innovation and improvements. More than can be said for the stale, outdated default provided by a certain large, malicious corporation. It will be interesting to see what Firefox is delivering as it reaches its tenth. The inclusion of semantic web awareness in the next release is (to me, anyway) a sign of good things to come.
So much for my previous prediction that OS X 10.5.2 and Aperture 2.0 would surface during PMA. But around a week and a half later, they are both a reality: 10.5.2 appeared in Software Update last night, and this morning Aperture 2.0 was announced ($99 upgrade, $199 full). There is also a 30-day trial of 2.0 available on the Aperture site, so if you can't wait (or are just curious) you can have a look today.
Reading through this post on The Key Difference Between Developer and Architect Roles, I was reminded of a few other key attributes that successful architects possess, and that developers (and certainly ex-consulting firm wanks) tend not to have.
Once upon a time I was an architect working on a large packaged application installation along with two ex-consulting types. These guys had zero technical background and were basically good for creating and following task lists, with no understanding of what the tasks were (or could be). Any conversation with them ended with them drolly replying, 'well, that's nice, but it's not in scope'. The problem is that if they had had a modicum of technical/architectural skill, they would have recognized that every suggestion was in scope, and had the recommendations been acted on, they would have saved the project enormous amounts of time and money.
For example, their task list said they should rubber-stamp the scripts they had for data transformation and movement. Well, in the ten years since the original scripts had been written, the company had acquired an ETL tool that would have made creating, modifying, and maintaining the data movement portions much easier and quicker. But no, that was 'out of scope'. The 'task list architects' spent something like 700x the estimate for the ETL effort to essentially build a hairball of a shell script-based hack that failed miserably. The team spent huge amounts of time and effort trying to maintain the scripts. On top of that, they had huge data consistency issues, because the scripts barely worked in one scenario, let alone had the flexibility to accommodate new requirements.
That was just one of their many 'successes' on the project. They did basically the same with the reporting for the system: rather than use the 'out of scope' modern BI tools, they 're-used' the ten-year-old scripting hacks. Another huge dose of fail. And again with environment (mis)management: somehow, through their utter ineptitude, they 'required' something like 39 copies of the production environment to complete their testing. Thirty-nine. The mind boggles.
But this is what you get when people who can barely write a requirements document (but have 'experience' from big consulting) adopt the title of 'architect'. Real architecture requires enough vision and understanding to know when to make the strategic and tactical decisions that enable a project to deliver a quality result. Real architects understand what changes can be made, and why, without greatly (if at all) affecting scope. Task list 'architects' can't see beyond their own tick lists.
Finally (well, from 2005), a study from MIT on the Effectiveness of Aluminum Foil Hats. It's worth reading the whole thing to get a sense of the detail involved, but it's all right there in the abstract :)
Among a fringe community of paranoids, aluminum helmets serve as the protective measure of choice against invasive radio signals. We investigate the efficacy of three aluminum helmet designs on a sample group of four individuals. Using a $250,000 network analyser, we find that although on average all helmets attenuate invasive radio frequencies in either directions (either emanating from an outside source, or emanating from the cranium of the subject), certain frequencies are in fact greatly amplified. These amplified frequencies coincide with radio bands reserved for government use according to the Federal Communication Commission (FCC). Statistical evidence suggests the use of helmets may in fact enhance the government's invasive abilities. We speculate that the government may in fact have started the helmet craze for this reason.
Britannica Blog's posting on The Problem of Data Storage points to how complicated archiving and retrieving things has become in the digital age. Before, it was enough to preserve tablets or paper; now this is greatly complicated by the various digital formats that house our data, and by how the formats themselves fall into disuse (in many cases making the enclosed data unavailable as well). PDF is a glimmer of hope; we shall see how it holds up to the test of time.
As anyone who has tried to migrate data from an ancient floppy can tell you, retrieving that information, though only 25 years old, is no easy task. (The floppy disk itself is a nearly extinct medium, for that matter.) The mere difficulty of retrieving old data provides the rationale for Adobe’s now-standard PDF (portable document format), documents that can be read and printed across any operating system. What is more, Adobe developers maintain, “ten years from now, and into the future, users will still be able to view the file exactly as it was created”—meaning that fonts, layout, and illustrations are locked into the document and cannot easily be changed, unlike documents created with standard word processing software. (For more, see Adobe’s white paper “PDF as a Standard for Archiving.”)
On a larger scale, this reminds me of the excellent book The Clock of the Long Now by Stewart Brand, which covers designing (and documenting) a clock that works on a massive scale and is intended to run for thousands of years.