Microsoft Acts | People React

As usual, Microsoft has been in the tech press recently (when aren’t they?).

There are a couple of issues that caught my interest:

MS Temp Fired for Blog Contents

According to his own blog, Michael Hanscom was fired from his temp position at Microsoft.

His crime? Taking pics of Apple G5s being unloaded at the MS Campus and publishing the pics – and the loading dock’s whereabouts – on his blog.

OK, a lot of people seem outraged.

Why?

Mainly because of kneejerk anti-MS feelings, apparently.

This guy – in all innocence, to be fair – took pictures at work, published them on the Web, and disclosed the contents of what was being unloaded, where the dock was, where he worked and so on.

This isn’t good – even if there is a certain paranoia on the MS campus, why would they risk keeping this guy (only a temp, as well)? What will he photograph/copy and post next? Code samples, meeting agendas, manager schedules? Sure, that sounds paranoid, but you just don’t out your employer in this manner, unless it’s a matter of public safety (which is why there are whistleblower laws).

Hanscom even tips his hand – in his blog entry – that he felt that pictures and postings could cause issues:

…”when I took the picture, I made sure to stand with my back to the building so that nothing other than the computers and the truck would be shown — no building features, no security measures, and no Microsoft personnel.”

Michael Hanscom

But then he posts the pictures with information about where the dock was and so on.

I’m sorry, I don’t feel too sorry for this guy – he screwed up, and – true – he didn’t mean anything malicious. But he did screw up.

You play, you pay…

Microsoft Bets the Company on Longhorn

Even C|Net is getting into the speculation over this latest acknowledgement from Redmond (not that it’s new info) that Longhorn is a “bet the company” move, calling the gambit a gamble.

I don’t get it.

This is a fulcrum point for MS – they can either (try to) keep selling WinNT-based OSes and virtually identical new editions of Office (is there anything in Word 2000 that you cannot live without that’s not in Word 95? Not unless you’re trying to use Word as a Quark substitute), or they can push forward.

This is the next stage in the evolution of personal computing, one that actually is predicated on the needs of business – like it or not, the DRM features, new file systems, services support and so on are squarely targeted at businesses.

Because it will make it simpler for Web services to (finally!) become commonplace.

Which is an interesting statement (if true) because that means that Web services won’t become commonplace for another four years or so (Longhorn due sometime in 2006; widespread adoption will take another couple of years after that).

But as far as the gamble MS is taking goes, I don’t think so. By the time the OS (and support tools, such as Yukon [the new SQL Server]) rolls out, businesses will be ready. Businesses have been slow to embrace XP – sticking to 2000 or NT (but NT support is now gone, since June, I believe). Unless the OS is delayed too much (always a possibility) and businesses move to XP in the meantime, there should be a real need for a new tool.

Especially if the businesses want to jump onto this new-fangled XML/Web Services thingee…

Meme’s the Word

meme – n.

A unit of cultural information, such as a cultural practice or idea, that is transmitted verbally or by repeated action from one mind to another.

– Dictionary.com

Loosely used on the ’Net (especially in the Blogosphere), a meme is a sort of zeitgeist, something/someone with that intangible buzz. At least that’s how I’m going to refer to memes in this entry.

Today, some obvious memes are Google and the act I’m performing right now – blogging.

But – for reasons best left unwritten (not because I’m hiding anything; the reasons are…meaningless and pretty darn boring) – I’ve been thinking about memes lately.

Mainly, I was thinking about memes of the past – things that once had that buzz.

Expired Memes:

  • Zdnet: Remember Zdnet? Next to C|Net’s news.com, it was my favorite tech news site for a few years. And then C|Net bought it. And it’s been going downhill ever since – a couple of good columnists left, but not much else. And it really doesn’t differentiate itself from news.com, so what’s the purpose? (Yeah, ad dollars..)
  • Jon Katz: Love him or hate him, he has pretty much evaporated ever since he left/was cut from Wired.com – but he was relevant in some fashion for a while. Hell, Slashdot even has a preference where you can suppress Jon Katz stories. While newbies probably never heard of him, Katz was a strong voice for some of the Web’s seminal years.
  • Browser Wars: Remember the browser wars? Sure you do… There is actually a new set of browser wars going on, this time not for installed base, but for standards support. It’s not in the regular media much because the fight is different: In the first browser war, MS wanted to own the browser to control the desktop. That didn’t really work out the way anyone thought it would. Today, the browser war is standards bodies and developers crying for standards…and MS doesn’t much care. How does that help them?
  • Netscape: Do I really need to comment?
  • Content is King: While I think the pendulum will, to a degree, swing back to this meme, right now it’s more flash (literally – Macromedia’s Flash) than substance.
  • Webmonkey: Remember when Webmonkey was relevant? A daily must-read? No more. Very sad.

Today’s Memes:

  • Google: While the Google backlash is certainly building and has been noted here, Google is still to search engines what Windows is to OSes – except most consider Google the best engine, while Mac and Linux/*nix users will – and can – present strong arguments for their choices.
  • Blogs: Again, there is a backlash in the works here – and the whole divisive nature of many eminent bloggers/blog tool makers has damaged adoption – but blogs have filled several important voids for many authors and readers:

    • Unbiased voices – single voices making a difference
    • Additional data – for reporters such as Dan Gillmor, blogs offer a way to supplement their stories, publish additional information that would never make it into dead tree publications (for many reasons). This cannot be a bad thing: Hey, don’t care about this extra stuff? Fine. It does not interfere with your print reading and so on. But it’s there if you care.
    • Publication ease – I’ve always maintained that the Internet’s killer app is not the Web, but e-mail. In the same way, blogging – for all its benefits – is (in my mind) most powerful as a simple way for anyone to publish. Sure, MS FrontPage is pretty easy and all that, but one still needs a domain (what’s a domain?), has to sorta understand FTP and so on. Fuggetaboutit. With some blog tools/services, all you need to know is how to use a browser and type. THAT’S damn powerful.
    • New life to the Home Page of yesterday: Today’s blog is yesterday’s My Home Page, to a large degree.

  • Wireless: Not a strong meme, but certainly one that is almost past meme because it’s been adopted so widely. Hell, it’s expected nowadays at tech conferences, and this will bleed over to regular conferences and other areas. Wireless is a stealth meme because there are so few reasons to fight against it. One may consider, for example, a tablet PC to be either an oversized Palm or a crippled laptop. OK. But the argument against wireless is probably only one of two: 1) protocol issues (a, b, g…), or 2) security (hard-wired is more secure than wireless, in general). Beyond that, wireless is a good thing. And these arguments are not Windoze vs. Linux issues. These arguments are for specific instances and can be easily reconciled.
  • *nix: Linux is a meme in itself, but it’s also part of a larger meme, which can either be described as a Windoze backlash (in this case, not necessarily a knee-jerk reaction) or as a real trend: People are looking for a stable OS. With hardware becoming a commodity and increasingly powerful, OS software is becoming more interesting to folks. And Linux (stable, cheap, hard) is earning a lot of attention, as is Apple’s OS X – a BSD variant with a solid GUI slapped on top of it. While I run Windows, and will until it’s unnecessary (necessary now because most others do – vox populi standards compliance), I like the concept of Apple’s OS X – it runs the Windows-type programs I need (MS Office, Photoshop) plus has the command-line interface that so many hate but I love. I’m always amazed that all the Linux talk and KDE vs. Gnome GUI flames take place without the explicit declaration that Apple has done what the GNU/Linux community (Lindows, Wine…) has been trying to do for years: a rock-solid OS with *nix underpinnings that has a stable, attractive GUI and runs software people know and love – not just GIMP.

Fall Back

Yes, it’s time for most of the country – except most of Indiana and Arizona, I think – to revert to Standard Time.

Time to reset all your clocks and change the batteries in the fire/smoke alarms.

I’m always amazed at just how many clocks there are in just my small house/small life:

  • Kitchen: Coffee maker, microwave, wall clock
  • Living room: Just a wall clock (a cuckoo clock, if you care…). I never use the VCR anymore, so that’s not touched
  • Bedroom: Couple of alarm clocks
  • Office: Desk clock
  • Bathroom: Wall clock

And this does not count the five computers I have (all currently set to auto change, by the way – Windows and Linux), cell phone (again, auto) and a wristwatch.

And I would say that I don’t have as many clocks as many people do – no clock in the dining room, none in the basement other than the one on the computer there.

We are a time-obsessed society.

Search Me Redux

According to a story that ran in the WSJ (sub required; I won’t link), Amazon’s full-text search (see preceding entry) hasn’t won over one publisher: Tim O’Reilly. (View the TechDirt article.)

This is a little surprising, because O’Reilly is usually in tune with stuff like this – hell, the O’Reilly site has lots of free online chapters of books they sell – an inducement to buy the dead-tree book, of course.

And this is pretty much Amazon’s goal, I would think (although they probably have loftier goals, as well).

And – interestingly (to me) – O’Reilly is quoted in the article as saying, “If [Amazon ends] up being a Google for published content…we need to think better about what publishers get out of it.”

Which is pretty much what I alluded to in my last entry.

I wonder what really went on there…it seems like something O’Reilly would be all over.

Search Me

Wow, I was just at Amazon and saw the full-text search it has going now.

Wow.

According to this C|Net article, it currently searches over 33 million pages of text.

Again, wow.

And I don’t think this is the last we will hear about this search. It sounds like a Lexis – for literature – type tool that…well, kind of encroaches upon Google’s turf (or any search engine, but Google currently is the champ).

Going to be interesting.

Picture of the Day

I’ve gotten some feedback on my Pic ‘O the Day feature that I’ve added to the left-hand column, most asking just how I did it.

The assumption is that it’s database driven; it’s not.

Since I am on Blogger, I’m pretty much stuck with a static site that’s written out by Blogger from the database they host and own.

This is bad and good:

  • Bad: I don’t have control over the templates, database and other functions like I do in most of my sites/my other development efforts. I have to funnel all my efforts through the Blogger tool.
  • Good: Since I am at Blogger’s mercy, I have to “roll my own” if I want additional functionality.

An example of such is the RSS (XML) feed that this blog has – it’s a Perl script that runs every five minutes from my home box: Grabs the index pages, parses out the necessary elements (strips HTML..) and writes out and uploads the RSS feed.
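For the curious, the general shape of that script is simple enough to sketch. Here it is in Python rather than Perl (all names here are made up, the tag-stripping regex is crude, and a real feed would also need XML escaping):

```python
import re

def strip_html(text):
    """Crude tag stripper -- fine for clean blog markup,
    not a general-purpose HTML parser."""
    return re.sub(r"<[^>]+>", "", text).strip()

def make_rss(channel_title, channel_link, entries):
    """Build a minimal RSS 2.0 feed.
    entries: list of (title_html, link, body_html) tuples.
    Note: skips XML escaping, which a production feed would need."""
    items = "".join(
        "<item><title>%s</title><link>%s</link>"
        "<description>%s</description></item>"
        % (strip_html(title), link, strip_html(body))
        for title, link, body in entries
    )
    return (
        '<?xml version="1.0"?>\n'
        '<rss version="2.0"><channel>'
        "<title>%s</title><link>%s</link>%s</channel></rss>"
        % (channel_title, channel_link, items)
    )
```

The real version also has to fetch the index page and upload the result, but the parse-and-write core is the interesting bit.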

Why is this good?

Because I learn from doing this stuff. If Blogger had a built-in RSS feed option, I would definitely use it. They don’t currently have one, so I set one up myself from scratch.

Is it elegant? Nah.

Does it work? Yep.

That’s good.

OK – back to the subject at hand: Picture of the day.

Again, this is a Perl script that I run from home (my host doesn’t allow CRON access…another obstacle!).

Basically, it uses the Net::Telnet package. Using this package, I – through the Perl script – perform the following tasks:

  • “Telnet” to my server and log in
  • Get listing of all JPGs in the full-sized image directory
  • Select one of these images at random
  • Copy the thumbnail and full-sized image that is today’s random picture to “random.jpg” in each directory (full-sized and thumbnail).

That’s it – about 20 lines of code, reproduced below:


#!/usr/bin/perl
use strict;
use warnings;
use Net::Telnet ();

my $myServer   = "[server name]";
my $myUsername = "[username]";
my $myPassword = "[password]";
my $imagesFull = "[path to full images]";

# create the telnet object and connect to the server
my $t = Net::Telnet->new();
$t->open($myServer);

## Wait for the login prompt and send the username.
$t->waitfor('/User\@domain:.*$/');
$t->print($myUsername);

## Wait for the password prompt and respond with the password.
$t->waitfor('/Password.*$/');
$t->print($myPassword);
$t->waitfor('/vde.*$/');

## Read the images in the full-sized image directory, one per line
$t->cmd("cd $imagesFull");
my @remote = $t->cmd("ls -1 *.jpg");
pop(@remote); # remove the last element: the shell prompt

# pick a random image
my $random = $remote[rand @remote];
chomp($random); # strip the trailing line feed left by the listing

## copy that file to "random.jpg" in the full and thumb dirs
$t->cmd("cp $random random.jpg");                   # FULL pic
$t->cmd("cp ../thumb/$random ../thumb/random.jpg"); # THUMB pic

exit;

That’s the hard part: Then I just set a Cron job on my local machine and it fires at the interval I want. (I haven’t firmed this up, but I’ll probably stick to once a day.)
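For reference, a once-a-day crontab entry for this would look something like the following (the 6 a.m. time and the script path are made up):

```
0 6 * * * /usr/bin/perl /home/me/bin/random_pic.pl
```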

I had originally set this process up with the Net::FTP module (because I had done work with this module before), but this didn’t make a lot of sense – I could easily pull back the directory listing, but FTP doesn’t support remote system copy operations (delete only).

So I initially had a script – which worked fine using Net::FTP – but that meant I had to find the random image (no biggie), then download the day’s pic and upload it again with the new name (random.jpg).

For both the full-sized pic and the thumbnail.

Doesn’t make a lot of sense to do four file transfers – and the full-sized images can/could be quite large – when two telnet commands (“copy [pic o day] random.jpg” for each – full and thumbnail – image) will do the same thing!

I knew there had to be a better way.

And I finally (thank god for the Internet & Google!) found the Net::Telnet module.

Installed it from CPAN, and got it up and functioning inside of an hour. I was able to copy a lot of the Net::FTP code (find random image…) right into this new script, and all was well.

One thing I did have to mess around with was the login part – this is not as seamless as the Net::FTP module (though I’m probably missing something).

The telnet script, much more than the FTP script, requires one to have actually done the scripted processes via the command line. Little differences crop up.

For example, the FTP script pulled back all the images straight into an array – just using “ls [image directory]”.

With the telnet script, I had to do an “ls -1 [image directory]” (to get a single-column listing), and it returned every element with a trailing line feed. So I had to chomp the selected image to remove this.

In addition, the directory listing – at least on my host – returned the shell prompt (i.e. “[$bash 2.0.1]#” or what have you) as the last element. So I had to use pop() to remove the last element.

I’m not complaining, but it does seem as though the Net::Telnet module is not as generic as the Net::FTP module – or maybe it’s that telnet is not as generic as FTP.

Whatever. It’s done. Little bit of work (a couple of times) and I learned a bunch.

That’s the bonus of Blogger – you’re forced to learn to advance.

I’m cool with that.

CSS Hell

Don’t get me wrong, I’m a big fan and big supporter of CSS, but sometimes it just seems impossible to get things going the way you want.

As indicated in an earlier entry, I’ve been messing with ImageMagick and adding some pictures to this blog. As part of this process, I did a slight redesign of the left-hand column…and all hell broke loose.

I don’t think it’s CSS’s fault – it’s mainly a problem with implementation: I can get it to work perfectly in IE with one set of code, perfectly in Netscrape with a similar – but different – set of code.

And – to be honest – when it comes to positioning and all that, I’ve a lot of experience but I’m not certain which set of code is the W3C compliant code for what I’m trying to do.

If either set is.

Very frustrating.
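To give a flavor of the problem: something as basic as a left-hand column can be written with floats or with absolute positioning, and the browsers of the day disagree on the details (IE 5’s broken box model, for one, computes a padded column’s width differently than Mozilla does). Both of the alternatives below are legal CSS2 – you’d use one or the other; whether they render consistently is another matter (selector names made up):

```css
/* Alternative 1: float the column and push the content over */
#sidebar { float: left; width: 180px; }
#content { margin-left: 200px; }

/* Alternative 2: absolutely position the column */
#sidebar { position: absolute; top: 0; left: 0; width: 180px; }
#content { margin-left: 200px; }
```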

Dave Winer writes frequently on this subject, and while he is a little too negative for my taste on CSS, I think he has a point.

Usually, he’ll be going through something like I went through yesterday and it just won’t work (across browsers). While someone will usually take the challenge and produce the code to make what is spozed to happen happen, it’s not intuitive or sensible. There are often lots of hacks necessary.

This is not good, and it’s frustrating to Dave because he is a programmer. For programmers, while there may be many ways to do the same thing – generate a random number, for example – and many tools in which to do so (PHP, Perl, Python, C, C++…..), at the end of the day you’ll have a routine that will generate a pretty good random number.

With CSS, it’s almost a crapshoot.

For fonts and such, it rocks and is very stable. The sizing issues and other browser-specific differences are still evident, but this is more a variance-of-appearance issue, not a completely-different-appearance issue, the way positioning is plagued.

CSS support is getting better in browsers, but that’s still frustrating: While CSS is now widely supported, not all of CSS is widely supported – or supported in a consistent manner.

While things have improved, we’re still mired – to some degree – in the Browser Wars. Except now the war is not over installed base (IE won dudes; get over it), but CSS support.

One step at a time…

Comcast Blues

Maybe it’s just me, but an Internet provider that does not support pings or traceroute is problematic.

I’ve had my Comcast cable account for over three years – it’s been through three or four owners (I think I first got it with Excite@home before they did their incredible we-won’t-sell belly-up), and – for the most part – I’ve been very happy with it. It’s quite zippy (though they have capped the upload speed, which is a handicap for a developer like me), and the outages have been very infrequent and – with one exception (over 24 hours) – pretty short.

I’ve generally been satisfied; I’ve recommended it to many other friends/associates.

But – for whatever reason – my ability to either ping or traceroute past the Comcast gateway has evaporated over the past couple of months.

This isn’t just a brief outage – the protocol (ICMP) appears to be blocked for outgoing pings and traceroutes. I’m talking all the time for the past 2+ months.

And the most frustrating part of all this is the customer support – it really is horrible. I have opened four tickets on this single issue, three have been closed with an “it must be your system” statement with no change in my inability to use these base Internet tools.

Hmm…it’s my system??

  • OK, I do have a home network. So let’s plug the cable directly into any of the four boxes I have in the office. Even after the recommended power-cycling of box and modem, same lack of functionality.
  • Comcast has said my OS is the problem, and I should call Microsoft. Uh…
    • Which of the three OSes that I’m running is the problem (Win2000, WinNT, Linux 7.3)? All three?
    • So I should ask MS about the issues I’m having with my Linux box?
    • When I use dial up on each of the machines (to a different ISP), all is well?
    • If it’s my machine, how come I can ping or traceroute to the Comcast gateway (not in my house/on my property)…but it dies there? That means it’s outside my equipment, ja?

  • In response to direct questions, Comcast is unwilling to say yes|no they do|don’t support ping/traceroute.
  • In Comcast’s help forums, Comcast techs have posted messages asking users to post their traceroutes so they can see how an upgrade went…uh, I can’t do that….
  • Comcast has consistently insisted that they are not blocking any ports. OK. Then explain what’s happening here.

But I vent.

Just had to.

The sad part is that Comcast has been, overall, very good.

But this is a big negative, and the customer support has been downright rude and – thus far – unable to even acknowledge that this is an issue (again, they won’t say it isn’t a problem, either), much less give me some resolution.

Oh well…let’s hope that fourth ticket will get the job done. Sure, that’s the ticket!

The Magick (sic) Continues

My last entry talked about how I was finally getting around to learning ImageMagick so I could automate some image processing.

The madness continues!

A little bit of history is probably in order:

  • My first career was as a photographer; I did it for approximately a decade. I still love photography, and – trust me – I have thousands of pictures laying around the house. You have been warned.
  • My second career was as a writer/journalist, and during that period, I was always the “geek” of the writing staff (or part of that sad-sack club). I did production work – desktop publishing (QuarkXpress, Photoshop, Illustrator and so on) and saw the promise of digital photography before it really happened.
  • Current career – computer dork – dovetails nicely with the first two when it comes to graphics and such. Which is why the madness continues!

So ImageMagick has been a treat for both my Inner Geek and my Inner Artist (actually, my Inner Geek is more of an outie…).

I’ve fired up the old scanner and have spent some time with it today. It’s been fun.

Next steps: Batch processing with error handling; gallery construction.

The first step is just coding – I’ll get it to do what I want, I’m certain of that.
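For what it’s worth, the batch-with-error-handling step might be shaped something like this Python sketch. The runner is command-agnostic and collects failures instead of dying on the first bad scan; the ImageMagick convert invocation in the trailing comment is the intended use, and every name here is made up:

```python
import subprocess
from pathlib import Path

def batch_process(src_dir, dest_dir, cmd_template, pattern="*.jpg"):
    """Run a command for every matching image, collecting failures
    instead of stopping at the first bad file.
    cmd_template: list of argv parts, with {src}/{dst} placeholders."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    failures = []
    for img in sorted(Path(src_dir).glob(pattern)):
        out = dest / img.name
        # fill the {src}/{dst} placeholders in the command template
        cmd = [part.format(src=str(img), dst=str(out)) for part in cmd_template]
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            failures.append((img.name, result.stderr.strip()))
    return failures

# Intended use, assuming ImageMagick is installed:
# batch_process("full", "thumb",
#               ["convert", "{src}", "-resize", "150x150", "{dst}"])
```

The failure list can then be logged or retried rather than killing an overnight run halfway through.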

The second step is more … uh, interesting.

Because I’m using Blogger, they own the database.

I’m going to have to figure out a system that will publish galleries locally and then push them to this site.

Again, doable.

But … in what way? There are a million (ok, more than two) ways to do this, so which path do I take?

That’s the frustrating, rip up the code/rip out the ethernet cable part.

Also the fun part.

Again, my Inner Geek is showing…