Spending Friday night at an installfest for a beginners' Python workshop... Good turnout, much geekery.
Friday, June 22, 2012
Sunday, November 22, 2009
Current reading list
Recent reading:
RFC 4880, OpenPGP Message Format
RFC 1951, DEFLATE Compressed Data Format Specification
All-Pairs Shortest Path with Floyd-Warshall
Yes, I am a geek. I'm on break. What's your point?
Thursday, November 12, 2009
My PGP public key
Cleaning some stuff up, realized I hadn't made this available... which rather defeats the purpose, now doesn't it?
-----BEGIN PGP PUBLIC KEY BLOCK-----
Version: GnuPG v2.0.12 (MingW32)
mQENBErznooBCADdHtfLKqscD9MyyVFH70owBwVedCIvk12RBQnPvom45HkKY7SV
[... remainder of key block truncated in this archive copy ...]
-----END PGP PUBLIC KEY BLOCK-----
Friday, September 11, 2009
About Time
The British government has issued an apology for its treatment (and hounding to the point of suicide) of Alan Turing, rightly regarded as the father of computer science. His leadership of the codebreakers of Bletchley Park shortened WWII by at least two years--at one point, Churchill was reading field reports from Wehrmacht officers before Hitler was. He demonstrated that the Halting Problem is undecidable; that is, it has no general solution that applies to all programs. When he was burglarized and blackmailed by an ex-lover, he reported it to the police...and found himself on trial for gross indecency, his security clearance revoked (and with it his career in cryptography destroyed), and forced to undergo estrogen treatments, which were known to have numerous side effects, including depression. He committed suicide at age 41.
Turing, I suspect, would have been appalled at the idea of being any sort of martyr for gay rights. And yet his career and life were cut short, and the world deprived of at least 20 years of a brilliant researcher, because of homophobia.
It's nice to see the British government recognizing, finally, some of the injustice that it perpetrated and condoned for so long.
Thursday, September 3, 2009
Evolution, Purpose, & Bad Arguments
I'm just now getting a chance to write about this post yesterday by Jim Manzi, who's filling in at Andrew Sullivan's, on The Evolution of God and Manzi's defense of (or at least argument for the plausibility of) the thesis of divine purpose behind evolution.
On the most basic level, he's right, of course. The existence of God can't be disproven any more than Bertrand Russell's Teapot can be disproven. But Manzi claims something else; or rather, implies it rather than making an explicit claim.
Manzi begins with an explanation of genetic algorithms as a way of introducing concepts of genetics, including the idea of using evolutionary functions as a way of optimizing on multiple dimensions at once. And while I'd have a minor quibble or two with his wording, his explanation is correct; he clearly gets it.
However. He observes that genetic algorithms are used to find the best combination of 'genes' (which may be data inputs or weights, settings on controls...almost anything, really) for a given purpose. Yes, that's true, according to some externally defined fitness function, a way of boiling everything down to a definitive way of determining, for any two arbitrary members of the population, that this one is a better fit than that one.
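(For the similarly geeky, a minimal sketch of the machinery. Everything here is an arbitrary toy--bit-string genomes, a count-the-ones fitness function--not anything from Manzi's post; the point is just that "better" is whatever the externally supplied fitness function says it is.)

import random

GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 50, 0.02

def fitness(genome):
    return sum(genome)  # the externally imposed definition of "better"

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)  # splice two parents at a random point
    return a[:cut] + b[cut:]

def mutate(genome):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # Selection: keep the fitter half, breed replacements from survivors.
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

print(max(fitness(g) for g in population))  # climbs toward GENOME_LEN

Swap in a different fitness function and the identical machinery optimizes toward a completely different "goal"--which is the hook the rest of his argument hangs on.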
In biological systems, of course, the "fitness function" is whether or not the organism survives long enough to breed and for its offspring to reach breeding age. (An organism that gives birth to a hundred young and eats 99 of them is less fit than one that gives birth to 3 offspring and carefully nurtures and protects them to adulthood.)
Manzi claims, correctly, that genetic algorithms can select the "best" combination. But: In biological systems, the real world, the "best" combination is determined by survival. This can include high-level activities, especially in social species--if I'm a good neighbor, my children will be taken in and raised even if I'm eaten by a bear.
Manzi seems to be making the implicit assumption that if there is a purpose to it all, the purpose must be to develop a mind capable of apprehending God.
(Which is terribly anthro-centric... Even if there is a purpose, why should it have anything to do with us? Why shouldn't the purpose of creation be to develop the perfect jellyfish? Using Manzi's own logic, this hypothesis can't be disproven.)
But evolution doesn't select for apprehension of the Divine; it selects for survival. Therefore, unless such apprehension, or its precursors, conferred a survival advantage on our ancestors, evolution wouldn't select for it. Is there any evidence that apprehension of the Divine, or its precursors, increased survival? Not that I'm aware of. Yes, religious belief is widespread, and was so in the ancient world, but that doesn't mean it's genetic. A study mentioned in Newsweek a few weeks back found that religious belief plummets in advanced societies with low levels of social dysfunction (poverty, crime, etc.). If it were genetic, it wouldn't fade out so quickly. On the other hand, social dysfunction was rife in the ancient world.
(Yes, this is the 'opiate of the masses' theory. Or, if you prefer, the "give me sense of being in control or at least a way to tell myself that someone's in charge" theory. Something can be universal or nearly so, and not biologically determined, if the environmental conditions that foster it are universal as well.)
So. If there's a purpose to it all, evolution probably isn't selecting for it. If you remove the assumption that if there's a purpose to it all then obviously the purpose of all creation is to result in us, you're left with...not much.
(Contrary to what some seem to think, evolution doesn't result in an "ascent" toward higher species... that sort of categorization is something popularizers put on it, but actual people working in the field don't. It's more like a bush branching out in all directions at once, not a tree climbing higher and higher.)
Also, a curmudgeonly point but one that needs consideration: We've existed as a species for only a couple of hundred thousand years, hardly a blink of an eye in geological time, and far less time than many other species that were once dominant. Given how rapidly we seem to be making the planet uninhabitable and that we may yet succeed in wiping ourselves out, I'd say the long-term survival value of intelligence is still something that awaits definitive demonstration.
At any rate, Manzi seems to have some recognition of the difficulty he's in here. He then falls back on some unseemly hand-waving, simply declaring it out of bounds for science, in bounds for philosophy, and therefore he'll believe whatever he pleases:
A scientific theory is a falsifiable rule that relates cause to effect. If you push the chain of causality back far enough, you either find yourself more or less right back where Aristotle was more than 2,000 years ago in stating his view that any conception of any chain of cause-and-effect must ultimately begin with an Uncaused Cause, or just accept the problem of infinite regress. No matter how far science advances, an explanation of ultimate origins seems always to remain a non-scientific question.

That's right, there's nothing for Science to do but throw up its hands and believe in teapots!
Now consider the relationship of the second observation to the problem of final cause. The factory GA, as we saw, had a goal. Evolution in nature is more complicated — but the complications don’t mean that the process is goalless, just that determining this goal would be so incomprehensibly hard that in practice it falls into the realm of philosophy rather than science. Science can not tell us whether or not evolution through natural selection has some final cause or not; if we believe, for some non-scientific reason, that evolution has a goal, then science can not, as of now, tell what that goal might be.
The combination of a constantly changing fitness landscape and an extraordinarily large number of possible genomes means that scientists appropriately proceed as if evolution were goalless, but from a philosophical perspective a goal may remain present in principle.

And having decided that it "may remain present in principle," Manzi assumes it into being. But he runs into other problems as well.
But in fact, even the “random” elements of evolution that influence the path it takes toward its goal — for example, mutation and crossover — are really pseudo-random. For example, if a specific mutation is caused by radiation hitting a nucleotide, both the radiation and its effect on the nucleotide are governed by normal physical laws.

He's apparently never heard of quantum physics. Those "normal" physical laws include fundamental limits on what's knowable or predictable. An event such as a nucleus ejecting a beta particle--i.e., a bit of radiation--isn't called random just because it's too insanely complicated for us to understand; it really is random. This has been taught in any college-level physics course for decades. He seems to admit this a paragraph later, but obviously hasn't thought through what it implies. Yes, statistically we can talk about half-life, which is an average, because we don't need to know which precise nucleus broke down--but when a phosphorus atom inside a DNA strand decays and is suddenly something else, it does matter which precise atom decayed, and which direction the beta particle was heading.
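(A toy illustration of the half-life point, with made-up numbers: each nucleus decays at random, yet the aggregate count tracks the expected curve closely.)

import random

p, nuclei, steps = 0.05, 100_000, 14   # assumed decay probability per step
remaining = nuclei
for _ in range(steps):
    # No individual decay here is predictable, even in principle...
    remaining = sum(1 for _ in range(remaining) if random.random() > p)
print(remaining)                         # ...but the aggregate is: ~48,800,
print(round(nuclei * (1 - p) ** steps))  # close to the expected 48,767

No run of this predicts which nuclei decay, or when; only the aggregate is lawful. That's exactly the distinction the "pseudo-random" gloss misses.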
The theory of evolution, then, has not eliminated the problems of ultimate origins and ultimate purpose with respect to the development of organisms; it has ignored them. These problems are defined as non-scientific questions, not because we don’t care about the answers, but because attempting to solve them would impede practical progress. Accepting evolution, therefore, requires neither the denial of a Creator nor the loss of the idea of ultimate purpose. It resolves neither issue for us one way or the other. The field of philosophical speculation that does not contradict any valid scientific findings is much wider open to Wright than Coyne is willing to accept.

Well, only by shifting definitions enough to keep them that way. Of course, the beauty of this particular dog-and-pony show is that the goalposts can always be moved. If we work out, for example, why pi has to have the exact value it does or why our physical laws are the way they are--perhaps no other set of laws would be stable--Manzi can always say "Yeah, but why is that?" and then declare that to be the 'ultimate' question. And when it's answered, he can find another. The problem isn't that "we don't care about the answers," it's that the question is poorly posed.
Might there be a purpose to it all, even if there's no evidence of such purpose? Sure. It can't be disproven. Absence of evidence isn't evidence of absence. But that's no reason to take it any more seriously than hen's teeth or orbital teapots. If someone wishes to believe there's a purpose to the entire universe, and they're it, that's their business, but "you can't prove it isn't!" is also no proof that it is. If you're claiming such a purpose exists, in the absence of any evidence, then you need to produce such evidence, or solid reasons why the lack of evidence doesn't matter. Unfortunately, Manzi has done neither.
Tags: computing, philosophy, religion, science
Friday, July 10, 2009
The Google Chrome OS
As the point of this blog is allegedly tech, maybe I should talk about tech once in a while...
There's an article on Slate discussing why the new Google Chrome OS is doomed before it ever gets out of the gate. I'm not sure I'd write it off just yet, and some of Farhad Manjoo's reasoning is just silly, but all in all it does look as if the odds are against it. Dealing with the 5 reasons why it's a bad idea, in order:
Linux is hard to love. This part of Manjoo's argument is the weakest, because he's cherry-picking his facts. Yes, we know, MS Office (Word, Excel, etc) won't run on Linux. Has he heard of Open Office? Of the GIMP for image editing? Apparently not. Are there features of the MS products that the Linux counterparts don't support? Not that he mentions... Apparently if it doesn't have the MS stamp on it, it doesn't exist.
Likewise, his argument about hardware installation is weaker than it first appears, mostly because he's apparently unaware of the progress that's been made in making Linux more user-friendly. The vast majority of hardware issues can be resolved much more easily than his example.
But. There's that last 5% of hardware issues that are tricky, or that involve multiple steps. There's less hand-holding in the Linux world, and a general assumption that you're willing to learn a little something about the computer you're using. Those who want to pay a couple of thousand dollars for a computer and then specifically avoid learning anything about it are advised to give their money to Steve Jobs, buy an Apple, and pat themselves on the back for being precious unique snowflakes.
There's also the issue of the entire ecosystem. I still use Windows. I could make the jump to Linux for 95% of what I do. I'm not worried about the oh-so-terrifying setup process (my last Linux install went smoother than any Windows installation has ever gone for me) or that I might have to configure something myself. But it's the last 5% of apps, the occasional game, the tool that I need to use because my workplace has standardized on it and while there are plenty of Linux tools that do the same thing, they're not file-system-level compatible... and it doesn't make sense to have a second computer just for those few things. I suspect I'm not the only one. And this is the biggest hurdle Linux (or Chrome) is going to have. I can use Open Office for writing and spreadsheets, GIMP for image editing, GnuCash instead of Quicken for managing my finances.... but if that last 5% of software is my favorite game, or a business-critical application, then I'm locked in to Windows no matter how much I loathe parts of it.
We aren't ready to run everything on the web. Again, an argument that gets weaker by the day as Web options develop. But this section contains what is actually Manjoo's strongest argument--that once everything moves to the web, it doesn't matter what OS the user is running anyway. That's the entire point of moving to the web, to do everything through that interface so it works the same way in any standards-compliant browser. (Which rules out Microsoft, but they're making progress... oh darn, there goes that snark again.) So if you're using Windows, or a Mac, or Fred's Kustom Home-Made Hand-Carved All-Natural OS, and are satisfied with it...why switch to Chrome?
Microsoft is a formidable opponent. Yes, indeed. As Tolkien put it, "It does not do to leave a live dragon out of your calculations, if you live near him." MS isn't a particularly innovative company and hasn't been for years. They wait until someone else demonstrates that a product can make it, and what features are necessary to gain popularity--then either buy it up or brew their own. And they have lots of smart people working for them, and the deep pockets to keep working on it until they get it right, generally around version 3 or 4. At that point, the second-to-none MS marketing team goes to work, and the competition gets ulcers.
Google Fails Often. I'm not sure what to make of this argument, which seems to boil down to "Google tries lots of things and not all of them work." Well, yes. The same could be said of a certain company in Redmond. And the small market share for the Chrome browser isn't necessarily an indicator that it's a failure, given its short time on the market and out of beta. Firefox demonstrated that if you do something better than MS, you can take share away from them. And that giving Microsoft some competition spurs it on to more innovation itself. (Development on Internet Explorer had essentially stopped until Firefox started taking market share away and goading MS into doing something about it. Monopolies lead to higher prices, slower innovation, and poorer quality. That iron law of economics has not been repealed for the benefit of the Redmond Behemoth.) Most companies fail often. Few have deep enough pockets to survive more than 2 or 3 such failures. Google can, so it can seem like a long record of failure. In fact, it's very difficult to predict what's going to take off, what's going to be popular, etc. So they throw everything at the wall & see what sticks. Failure is part of the process. Thomas Edison said some very similar things; most of what he did failed. The few successes made up for it.
Chrome makes no business sense. Again, he raises a point, and this is one I do wonder whether Google has considered. What's the business case? Attracting people to Google so they'll use Google's tools? They could do that in Windows. Push costs down to help grow the netbook market? Microsoft has shown it's willing to cut incredible deals & take a loss to keep its hegemony. It's hard to see how this leads to increased share of an existing market or substantial growth in an emerging market that Google wants to be a part of.
So, bottom line. Doomed? Not necessarily. Manjoo overstates his case a bit. But the odds are against it. Given some time and some luck, it may become a niche player. Of course, at one time, that's what Microsoft was, too....
Monday, June 1, 2009
On "irrational numbers" and innumerate bloggers
Warning: Extreme nerd-dom ahead.
So there's a post at the Atlantic about the government's ownership of 60% of GM, taking on the ludicrous claim that the government now owns a large swath of corporate America. As the post correctly points out, the actual ownership of private companies by the federal government is substantially under 1/10 of 1%, which is hardly a socialist paradise. However, the reporter completely blows it with this discussion of the (admittedly rather fun) pie chart produced by Excel:
What I do see is that Microsoft Excel feels the need to portray the percentage of American companies owned by the government as an irrational number. That's 5.07e^-02, or %0.0507 of American companies that are owned by the United States. (When I ask Excel to display this breakdown in real numbers it just becomes "100%" and "0%.")

Um, no. Wrong. That's not an irrational number. It's scientific notation. An irrational number is a number that can't be expressed as a ratio of two whole numbers. It has nothing to do with whether it's written in a mantissa-exponent form (as in 5.07 * 10^-2). Scientific notation is handy for dealing with very large or very small numbers, and yes, Excel would round it to 0%, unless you asked Excel to display things out to some fixed number of decimal places.
Which is simple to do, as is suppressing scientific notation in the first place. Either one takes about 4 clicks of the mouse.
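(The distinction is just as easy to see outside Excel. A quick Python check, taking 0.000507 as the fraction in question--i.e., the "under 1/10 of 1%" figure:)

x = 5.07e-04        # scientific notation for 0.000507 -- perfectly rational
print(f"{x:.4%}")   # 0.0507% -- the percentage, out to four decimal places
print(f"{x:.0%}")   # 0% -- rounded away, as in the "real numbers" view
print(f"{x:.2e}")   # 5.07e-04 -- the same value, exponential notation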
Really, people who do business reporting should be familiar with the basic functions of spreadsheet software, and if you're going to complain about something being a particular type of number, you should have some idea what you're talking about.
Update: After numerous comments, the post has been fixed: instead of "irrational number" it now says "exponential notation." All is well.
Tuesday, July 8, 2008
Still a few bugs in the system
It seems the face-recognition technology used to verify the age of cigarette vending-machine users in Japan still has a few bugs.
The facial recognition problem is hard, and trying to use specific features to estimate age for a generic face is even harder. Add in ruling out doctored photos, makeup tricks, etc., and it gets exponentially harder.
So kudos for trying, and it's not surprising there are problems. I'm just surprised it's being deployed at this stage.
[H/T: Andrew Sullivan]
Saturday, July 5, 2008
Missing the point
Newsweek tries to be cutting-edge by suggesting the Internet is the new sweatshop.
They may be teenagers posting videos of themselves dancing like Soulja Boy, programmers messing around with Twitter's tools to create cool new applications or aspiring game developers who want to create the next big thing. But what they all have in common is a somewhat surprising willingness to work for little more than peer recognition and a long shot at 15 seconds of fame.

Because, of course, money is the only possible motivation to do anything. If you're not doing it for money, you're being a rube. The enjoyment of doing it? Of making a contribution? Oh pish tush.
Whether these 21st-century worker bees can be said to be having fun (is it really entertaining to update a Wikipedia entry?), there's no question that their moonlighting has value even if they're not being compensated.

I don't know if "entertaining" is the right word. "Sorry, gang, I won't be tagging along to the party, I'm going to stay home and update Wikipedia" sounds a little unlikely. But it can be rewarding. Making a contribution, even if "just" for peer recognition (and why is that a bad thing?), has its rewards.
It's also possible N'Gai Croal, the author of the piece, is, like most journalists, innumerate. (I don't know, one way or the other.) Consider that the entire Internet-connected population is currently about one billion. What happens if everyone connected to the internet contributes a half-hour per month of their time to something (making vids, music mixes, game programming, artwork, Wikipedia edits, whatever) that ends up on the internet?
That's five hundred million hours--roughly five Wikipedias--every month.
Someone please send Newsweek a copy of Clay Shirky's article on social surplus, and why we're on the verge of another boom.
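(Checking my own arithmetic, using Shirky's ballpark--roughly 100 million hours of cumulative thought for all of Wikipedia, the estimate assumed here--as the yardstick:)

users = 1_000_000_000            # internet-connected population, roughly
hours = users * 0.5              # half an hour per person per month
wikipedia_hours = 100_000_000    # Shirky's rough estimate (assumed)
print(hours / wikipedia_hours)   # 5.0 -- five Wikipedias a month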
But as long as so many of you are willing to work for free, the proprietors of these virtual sweatshops will happily accept.

But again, this isn't the point. A half-hour a month, volunteered, is hardly a "sweatshop," no matter how (melo)dramatic a headline it makes. Yes, there are some who obsessively devote every spare hour to contributing, just as in the pre-internet age some people obsessed over their bottle-cap collections.
Peer recognition, satisfaction from making a positive contribution, and the pleasures of creating are sufficient motivation. Scaled up, across the internet, enough small contributions add up. And again, Croal fundamentally misunderstands.
Yes, the vast majority of uploaded videos are well below professional quality. Most blogs are barely worth reading. (No comment on whether this one falls within that category.) Sturgeon's Law applies on the internet as well as it does everywhere else. But again, the simple scale of the phenomenon means that the top few percent will be very good.
Newsweek asked whether the internet is the new sweatshop, and answered its own question with a yes. In fact, the way they phrased the question shows they don't understand the subject in the first place.
[h/t: Andrew Sullivan]
Friday, July 4, 2008
Privacy and Data Mining, and false positives
James Wimberly has an interesting post up at RBC about the possibility of large-scale data mining as an intelligence-gathering tool, its costs, and benefits. I'm less sanguine about the possibilities of any such large-scale operation than he is, but he's absolutely correct to note that we need to be having this discussion in the open, rather than relying on the "lawless-unitary-executive-knows-best" model that we've been running on so far.
The problem with any such operation is dealing with the false positives, the things the algorithm says are suspicious that turn out to be nothing. Let's say his "third-degree" assumption is roughly accurate and we're targeting about one million people nationwide for data surveillance. Further assume that there are as many as 1,000 truly dangerous terrorist organizers in the US--determined, competent, well-financed. I'm not talking about janitors with fantasies of blowing up airports, I'm talking about people who have figured out how an airport could be sabotaged, and have access to the means to carry it out, and are motivated to do so.
First of all, we note that 1,000 / 1,000,000 = 0.1% of our targets are actually dangerous. The other 999,000 are not dangerous--not motivated, incompetent, don't have the means, whatever.
Now suppose we have a screening method that can detect 95% of the bad guys and screen out 99% of the non-bad-guys. This is, of course, MUCH better than any actual method can do. But run the numbers:
We find 950 out of 1,000 terrorists (true positives), leaving 50 dangerous people at large (false negatives--people we think are harmless, who really aren't).
We also round up 999,000 * 0.01 = 9,990 people who aren't dangerous but weren't screened out--false positives.
Meaning we round up a total of 9,990 + 950 = 10,940 people, of whom 9,990 (about 91.3%) aren't dangerous.
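(The arithmetic is worth checking for yourself; a minimal sketch in Python, using exactly the numbers above:)

population  = 1_000_000    # people targeted for data surveillance
dangerous   = 1_000        # truly dangerous organizers among them
sensitivity = 0.95         # fraction of bad guys the screen catches
specificity = 0.99         # fraction of innocents correctly screened out

true_pos  = dangerous * sensitivity                       # caught
false_neg = dangerous - true_pos                          # still at large
false_pos = (population - dangerous) * (1 - specificity)  # innocents flagged
flagged   = true_pos + false_pos                          # total rounded up

print(f"true positives:  {true_pos:,.0f}")            # 950
print(f"false negatives: {false_neg:,.0f}")           # 50
print(f"false positives: {false_pos:,.0f}")           # 9,990
print(f"flagged total:   {flagged:,.0f}")             # 10,940
print(f"innocent share:  {false_pos / flagged:.1%}")  # 91.3%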
This isn't as much of a needle-in-a-haystack problem as we started with, I'll grant. But what happens when we tell investigators to go through a list of people with suspicious data traffic, but to remember most are probably completely innocent?
Well, we know that about 5% of Americans cheat on their taxes. And we know that IRS auditors, who spend all day dealing with tax fraud, estimate that 30% of Americans cheat on their taxes. When you deal with something out of the ordinary all day long, you can forget how out of the ordinary it is. When you deal with tax cheats all day, you tend to overestimate the prevalence of tax cheats.
Good luck getting your investigators going through the list of "data-based suspects" to remember that 90% are probably innocent or harmless, or both.
We can't just sit back and do nothing. But we should also avoid falling into the trap laid out in Yes, Minister:
We must do something.
This is something.
Therefore, we must do this.
Monday, June 30, 2008
"Manners," Microsoft style
Microsoft has filed a patent on something euphemistically called "digital manners," which will restrict the behavior of mobile devices in areas where it's being deployed. No more annoying cell phone calls in the middle of a movie! No more worries about some creep with a camera phone at the gym locker room! What could possibly go wrong?
Of course, it also means your cell phone will politely refuse to take a picture at a concert. Or during a police raid. Or your iPod will politely refuse to transfer music to or from any computer other than yours.
But we're not supposed to notice that. Does anyone really think end users will be able to set "manners" policies? Or that it won't be implemented for the convenience of the media companies?
Tuesday, May 20, 2008
Quote of the day
I have a theory concerning committees. A committee may have different states, like water has gas, liquid, or solid phases, depending on temperature and pressure. The same committee, depending on external circumstances of time and pressure, will enter well-defined states that determine its effectiveness. If a committee works in a deliberate mode, where issues are freely discussed, objections heard, and consensus is sought, then the committee will make slow progress, but the decisions of the committee will collectively be smarter than its smartest member. However, if a committee refuses to deliberate and instead merely votes on things without discussion, then it will be as dumb as its dumbest members. Voting dulls the edge of expertise. But discussion among experts socializes that expertise. This should be obvious. If you put a bunch of smart people in a room and don't let them think or talk, then don't expect smart things to happen as if the mere exhalation of their breath brings forth improvements to the standard.
Tags: computing, philosophy
Thursday, May 15, 2008
Useful link of the day
How to secure your Windows computer and protect your privacy online.
A very useful compendium aimed at the non-expert, describing the most common threats to security and privacy, and listing a wide variety of software tools, most of them free, for avoiding and fixing problems.
Good stuff.
Thursday, November 8, 2007
Various updates
Took the team to the regional programming contest this past weekend, did fairly well... Useful learning for the programming contest we're hosting in April (that I somehow got put in charge of).
Meanwhile, got Eclipse installed and running on my office machine for Java and C++, and located a decent downloadable C++ reference we can install on the contest machines.
Now we just need to work out the details, such as where we're going to put the contestants...
Tuesday, September 25, 2007
The more things change....
The biggest IT/business trend of the last 20 years, of course, has been the outsourcing of work to India.
Of course, all that demand for programmers in India is driving up wages, and the influx of foreign investment is strengthening the Indian economy and therefore currency. What's an enterprising Indian company to do?
Outsource its outsourcing, of course.
Or, as Ashok Vemuri, an Infosys senior vice president, put it, the future of outsourcing is “to take the work from any part of the world and do it in any part of the world.”

[snip]

Such is the new outsourcing: A company in the United States pays an Indian vendor 7,000 miles away to supply it with Mexican engineers working 150 miles south of the United States border.

And so it goes....
Monday, September 3, 2007
They just don't learn
Sony is in trouble, once again, for putting virus-like "security" software on its products and not letting anyone know it's there.
What's interesting is that this "vulnerability" is being compared to the XCP fiasco. That "vulnerability" was a botched implementation that weakened security, was impossible to remove, and--coincidence of coincidences--even though it used many of the same techniques as viruses, leading antivirus software ignored it. The AV companies just said they worked with "industry partners." Uh huh....
At one time, Sony defined cool in consumer electronics. Today, they're not only user-hostile, they're incompetent.
Read Schneier's book. Or Schneier's other book. Security by obscurity is inherently insecure. And in this case, once again, using products that relied upon it put users at risk.
Yes, security is hard to get right. But you'd think after getting burned on a fundamentally flawed approach that outraged users, outraged regulators, and cost the company millions, they'd have learned their lesson.
Apparently not.
Cart Here, Horse There
A NYT article on the evils of AdBlock manages to be insulting while completely missing the point. The problem, you see, is that this evil program actually gives the user some control.
What happens when the advertisements are wiped clean from a Web site? There is a contented feeling similar to what happens when you watch a recorded half-hour network TV show on DVD in 22 minutes, or when a blizzard hits Times Square and for a few hours, the streets are quiet and unhurried, until the plows come to clear away all that white space.
But when a blizzard hits Times Square, the news reports will focus on the millions of dollars of business lost, not the cross-country skiing opportunities gained.
Likewise, in the larger scheme of things, Adblock Plus — while still a niche product for a niche browser — is potentially a huge development in the online world, and not because it simplifies Web sites cluttered with advertisements.
[snip]
[T]he program is an unwelcome arrival after years of worry that there might never be an online advertising business model to support the expense of creating entertainment programming or journalism, or sophisticated search engines, for that matter.
First of all, the purpose of the web, and the internet in general, is not to make money. No matter how many latecomers want it to be.
Second, most online advertisers should be grateful I'm blocking their ads. If I don't see their ads, I don't know anything about them. But seeing their ads gives me an impression of the company, and most of those impressions are overwhelmingly negative. Dancing aliens, jarring flashing colors, suddenly getting a sales pitch blaring over my speakers and having to hunt for the ad that's causing it, then the purposely-obscured mute button on the ad, popovers, scroll-bys...
Look, it's really quite simple. If you go out of your way to annoy me, and I have to go out of my way to shut you up, you're not making me want to buy your product or service. Your crackhead tech-school-dropout web designer may be proud of himself for coming up with code that keeps your ad on top no matter what, but all you're really doing is driving people to seek out ad blocking software and to avoid you entirely.
And as for the websites that block Firefox entirely...well, again, I think we should thank them for self-identifying their priorities, so a boycott is simplified. Firefox offers a valuable, almost uniquely valuable, service. Very few websites do, and even fewer online merchants do. So if I need to choose between Firefox and yet another widget-seller, it's an easy choice. Almost a no-brainer. Just as easy as the decision to install AdBlock Plus was in the first place.
Sunday, September 2, 2007
Bad ideas, probably ready for import
From the BBC we learn about a plan in Germany to plant spyware onto suspects' computers via spam email.

The e-mails would contain Trojans - software that secretly installs itself on suspects' computers, allowing agents to search the hard drives.
There would only be "a few" of these (no specification of how many) and for a limited time (no hint of how long).
I don't know what the situation is with German law, though news reports are cited that privacy laws may be violated by this. In the US, of course, there would (in theory) need to be court approval, but as we've seen lately, that's not really required; it's more of a suggestion, just being in the constitution and all.
I give it 6 months before it's done here, and 6 months after that before the story breaks and we find out how many cases it's been used on. And it'll be a much higher number than anyone expects.
And of course, this raises some questions. Will antivirus software be "updated" to ignore "official" spyware? If I find it and delete it anyway, is that taken as proof of malicious or criminal intent? Is it interfering with an investigation?
Feh.
Monday, August 27, 2007
Missing the point, again
A discussion of digital restrictions management (DRM) on the BBC website quotes the usual sources, and gets the usual party line: music was released on CDs, without DRM, so people expect it to be available; video, on the other hand, has never been released in a non-crippled form, so it's perfectly okay to keep crippling it. Or so says the industry spokesdrone:
"There isn't a contradiction of approach between the physical and digital products... Video content and DVD has always been very protected - people do not expect to copy DVDs easily."
[sigh] No. Let's take this from the top. Again.
Users have certain rights. They're a matter of law. The fact that the MPAA threatened and bribed hardware manufacturers into selling crippled players that don't respect those rights does not mean those rights don't apply. Just because the MPAA could do it, and dare anyone to stop them, doesn't mean they were right to do so, or that there's any presumption that they should continue to do so.
But as long as no one complains, they can go right on ignoring the law.
Saturday, August 25, 2007
On the other hand....
How Many Germs Live On Your Keyboard? 1,617,840
Useless-trivia dept: This is supposedly approximately the same as 324 toilet seats.
