CHAOS MANOR MAIL: Mail 165, August 6 - 12, 2001
IF YOU SEND MAIL it may be published; if you want it private SAY SO AT THE TOP of the mail. I try to respect confidences, but there is only me, and this is Chaos Manor. If you want a mail address other than the one from which you sent the mail to appear, PUT THAT AT THE END OF THE LETTER as a signature. I try to answer mail, but mostly I can't get to all of it. I read it all, although not always the instant it comes in. I do have books to write too... I am reminded of H. P. Lovecraft, who slowly starved to death while answering fan mail.
This week: Monday, August 6, 2001
I have a LOT of mail on this subject:

In a world full of web sites where style dominates and content is almost non-existent, yours is a breath of fresh air. Personally, I'd far rather read something thought-provoking than marvel at the font it's written in. Regards, Colin

o o

I think your website looks fine. About the only improvement I wish you would consider is placing the "Monday, Tuesday, etc." jumps and Amazon ad in a frame. Other than that, who would want anything more? Gregory W. Brewer

o o

Dr. Pournelle, Don't worry, be happy. I, for one (and I am sure there are many others), actually LIKE websites that are high on content and low on bells and whistles. I usually go somewhere else if a site looks like it was designed by an interior decorator; I can't stand the wait for the graphics or the Java. If I wanted bells and whistles, I'd go to a large department store franchise. Keep up the good work. And remember, you can't please everyone, so just keep yourself happy. George Laiacona III

o o

Dr. Pournelle: If I were to buy a new book you had written, in which the typography was ridiculous, the pleasure fleeting, and the price exorbitant, I would complain, and I would be justified. However, I read your website for content; there are just enough doodads for easy navigation, and the price is certainly more than fair. Could I afford it, I would pay double your rate for subscription access. In order for you to add the bangles and beads your correspondent wanted, I'd have to put up with longer download times (on my end, that's shared access (5 users) to a 56K dialup) and probably less content, as you would have to charge more and/or hire an assistant. This would decrease your total output, which wouldn't make me happy either. Also, we're holding at Netscape 4.7x and IE 5.0, because both work and because newer versions raise some suspicions; sometimes things can be ascribed to malice AND incompetence.
So, to your correspondent who calls this site "hideous": What do you want, sir? Egg in your suds? Mark Thompson

o o

Jerry, I don't think your web site is hideous at all. Now, let's take a look at what these people consider to be a good web site. It has frames (meaning each item is smaller on the screen and you need to scroll more to read it), lots of animated GIFs (which take longer to download on a modem and are often very distracting), fancy colored backgrounds and multicolored text (so that often you can't read the text because it doesn't contrast with the background), and a message that says "Sorry I haven't kept the page up to date, I'll try to do more this week". At the bottom of the page it says "Last updated 07/06/2000." I'll take your "hideous" web site over these any day. Chris Keavy

There was a lot more, but this is a representative sample. Thanks to all who responded. Clearly I think content is better than form, which is just as well because I don't really have time to devote to form anyway: but I also don't think this is all that hideous. It's readable and you can generally find what you want; what more is there? Ah well.

o o

Jerry, I'm a fan. Period. But your recent article contains something that I either am totally clueless about or..... Why do you refer to CD-R and DVD-RW as "silicon" storage? There isn't any silicon in these media at all. The basic platter material is polycarbonate, a tough plastic. In most media there is a reflective layer: gold, gold/silver, or aluminum. In the writable media, there is a layer which contains a dye or optically or magneto-optically active material, depending on whether the disk is CD-R, PD, or...... I just don't see any connection to silicon, though. Thanks for the years of informative and entertaining reading. Cliff

Alas, there's not much to say. Please pass the crow...

Subject: music creation software.

Hi, Always enjoy your columns. Used to buy Byte just to read them.
Why is it so strangely compulsive to read "first I installed the xxx. Nothing happened, so I took out the yyy, then reinstalled the xxx. It worked. Recommended"? I guess because I am always installing and reinstalling the xxx too. Anyway, I have recently been looking into home music recording, you know, browsing the vendor sites and the online review magazines, checking out what's new. I was surprised to find that the good old generic PC has quietly reached another turning point in performance, perhaps one that will render some professional recording studios obsolete (except as nice ambient places to hang out), so I thought I'd let you know. Maybe you can muse on it in a future article, or else maybe it's old news to you.

Basically, musicians and reviewers online are openly talking about throwing away/selling off their sound effects hardware, because their PC can now completely contain a professional recording studio in software. No need for external hardware samplers, synthesizers, etc. (e.g., comments at www.cubase.com). And the really breakthrough thing, the thing that fired my imagination, is this: you can now play your instrument using the PC as the sound processor, in REAL TIME. The only non-standard thing you need is an upgraded sound card. (Obviously, you need loads of RAM, fast disk, fast CPU, but that's standard these days.) I'm not talking about SoundBlasters, but dedicated high-quality digital I/O sound cards, which are incredibly spec'd with 24-bit/96 kHz inputs at very low latency, and for just a few hundred bucks. One obvious-looking choice is a base model card for $128 (www.hoontech.com). For that, you get a very low latency sound card, and after that you're all digital; in the PC you can emulate vintage synths and amps, entirely in software. You need the low latency for the realtime playback. The SoundBlaster Live! card works, but with 150 ms latency it sounds like an echo chamber. Dedicated digital I/O sound cards claim 2 ms latency.
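The latency figures Tom quotes follow directly from buffer size and sample rate: a sound card cannot release a sample until the buffer holding it fills, so one buffer's worth of samples is the floor on delay. A minimal sketch of that arithmetic (the buffer sizes used below are illustrative assumptions, not specs from any particular card):

```python
# One audio buffer's worth of delay, in milliseconds. A card can't emit a
# sample until the buffer holding it is full, so this is the latency floor.
def buffer_latency_ms(buffer_frames, sample_rate_hz):
    return buffer_frames / sample_rate_hz * 1000.0

# A dedicated card running a tiny 96-frame buffer at 48 kHz:
print(buffer_latency_ms(96, 48000))    # 2.0 ms
# A consumer card buffering the equivalent of ~6,615 frames at 44.1 kHz:
print(buffer_latency_ms(6615, 44100))  # 150.0 ms
```

The point of the "2 ms vs. 150 ms" comparison is simply that the dedicated cards run buffers roughly seventy times smaller, which is why one feels instant and the other sounds like an echo chamber.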
So fast you don't notice, even with your CPU running effects algorithms in software (you know: chorus, flange, reverb, etc., the stuff that wannabe musicians like me used to stare at in music shops, wishing they could swing the $100 for each effects box). I'm pretty jaded as far as 'new stuff' goes. I've got 3 PCs, hooked up through a Linksys router to high speed cable; I work from home often; I thought the PC had no surprises left for me. However, the idea of a CPU (my CPU!) implementing professional grade sound effects in real time really amazed me, and I plan to get one of these cards ASAP. All that's stopping me is analysis paralysis; there are so many choices. Of course, there are system problems galore no doubt, and I have seen countless bulletin boards with musicians swapping hardware config stories that would rival your own epic battles! All part of the fun of having PCs, and I'm ready to dive in. So, in conclusion, gee whiz! PCs are great! I have found my killer app, and I'm off to buy a new sound card and try out the infinitely flexible sound effects processor/home studio I didn't know I had. -- tom loftus

Good to hear from musicians. I'm not very musical, having been deaf since I was in the Army...

Can anyone help here:

Jerry, I have several reels of old Super 8 family movies that I would like to convert to VHS or DVD. Obviously, DVD is the desired medium. Unfortunately, I do not own an 8 mm projector, or I would just play the movies and record them with a camcorder. I would like to obtain the highest quality affordable. Do you have any recommendations as to how I should pursue this project? Thank you in advance. Randall M. Haney, Instructional Program Manager, Aviation Center of Excellence, Florida Community College at Jacksonville, 13510 Aerospace Way, Cecil Commerce Center, Jacksonville, FL 32215. rmhaney@fccj.org

I don't have a clue, but if it can be done one of the readers will know....
Jerry, Metricom/Ricochet to be auctioned by the Bankruptcy Court on August 16. http://www.metricom.com/auctioninfo/ Jim Dodd

Yep. It goes away Wednesday sometime. Earthlink has sent me an announcement. Alas and alack.

I just spent a day and a half eradicating SirCam from some office PCs, and you might be interested in a sure-fire manual method that worked for me. I described it in today's post in my weblog. Here's the link: http://www.geocities.com/maranathapmoore/archive/20010804.html#Thursday - Pete Moore, IT Engineer, Precision Design Systems. Personal email: peterm00re@email.com; personal website: www.peteranthonymoore.com
This week: Tuesday, August 7, 2001
Mr. Haney: Here are some links to companies that will convert home movies to DVD. They use digitizers that work directly off the film print and will give higher quality than projecting and recording. I hope this helps. Mike Plaster

http://www.floridaphotoandvideo.com/htm/about.html
http://www.watrousvideo.com/services.html
http://www.homemoviesoncd.com/
http://www.cjstechnologies.com/
http://www.endzoneproductions.com/EP_Pages/DVD_VIDEO.htm

Thanks!

The following is Eric Pobirs commenting on some earlier remarks by Robert Bruce Thompson. In fairness I ought to edit both of these and present them together, since they are both excellent, but given time constraints I am going to put this up as I have it from Eric. It is long and very worth your reading. I intend to copy this discussion and other comments to a special page devoted to it, but for now here it is:
Abuse of the RICO Act started almost immediately upon its inception. It has been a favorite tool of censorship. Among the first victims was a chain of comic book shops in the Midwest. Some people who were in a fury over the very existence of material like 'Omaha the Cat Dancer' decided that its presence in the 'Adults Only' section was no different than handing it out to small children on playgrounds. Due to this chain's geographical dispersion and a strained interpretation of RICO, a case was made that moving inventory between stores fit the bill for racketeering. It was an incredibly grotesque misuse of power and breach of justice in so many ways. No illegal material was involved. No crimes were committed. Just some bluenoses in power who decided that the potential for committing a crime (a child seeing a drawing of a naked anthropomorphic feline, the horror, the horror) was a de facto crime in and of itself. There have been numerous other cases, many involving old cronies of Charles Keating. Long before he was ripping off retirement funds, he was an anti-porn crusader. Harlan Ellison wrote a scathing attack on him long before the S&L scandal broke, and then had the sad satisfaction of writing an I-told-you-so sequel.

>> [From Robert Bruce Thompson] I think XP is going to be a disaster for Microsoft. I predicted that W2K would be a disaster for them, and it has been. But XP is going to make them long for the good old days when they were able to sell at least a few copies of Windows 2000. They're losing ground big time in server space to Linux, and XP just puts them further behind. As Linux and other Open Source stuff continues to develop, Microsoft is going to start losing desktop market share as well. Realistically, I'd guess that it'll be at least two years until Linux shows up as anything more than an asterisk in desktop market share, but once Linux gets its foot in the door the floodgates will open.
>> In five years, I think Microsoft will be seriously sweating the desktop. In ten years, I believe that Windows will be a minor player on the desktop, if indeed it's still around at all.

Excuse me, but in what way has Windows 2000 been a disaster? Because it hasn't sold half a billion licenses yet? Every other company in the industry would be thrilled to death to have the revenue Win2K has generated thus far. It was a foregone conclusion that once the release date slipped past the third quarter of 1999, it was going to sell much less than it otherwise could have as part of Y2K update efforts. By the time most companies that had just freshly rolled out tons of new hardware and OS installs were ready for another major update, there would be at least one generation of Windows after 2K released. This is an unfortunate circumstance but hardly a disaster. Microsoft admitted they expected to take a big hit on sales for this very reason. They would have been better advised to release an NT 5 without all of Win2K's features in time for Y2K purchases, but they've also become paranoid about the unwarranted accusations of requiring constant upgrades. This is one of the myths that just won't go away. I sincerely doubt that anybody in the executive offices in Redmond believes they're going to get the world to buy each and every new release. Windows 98, for instance, moved a lot of boxes at retail, but those sales are a tiny slice of the pie compared to licenses shipped with new PCs. If you were the type to build your own box, the changes in PC hardware and standards made it quite reasonable to buy the newer release, unless all you cared about was running Win95 faster and not taking advantage of any other new items. A fair number of people bought upgrades for their old machines that shipped with Win95 or even earlier, but a vast number didn't. They didn't feel the need for a new PC and didn't see or know about any compelling features in the newer OS release. Then more time passed.
The numbers are currently rather low with the outbreak of post-dot.bomb sanity, but even so, every day there is somebody with an elderly PC (or someone buying for the first time) buying a new PC with an OS many generations removed from the one they've been using. I'm hard put to see why this is a bad thing. It makes the company keep trying in an industry where nearly all of the competition has suicided. (It's easy to blame Microsoft, but most of these companies put themselves out of business.) If you bought a new car four years ago, does it offend you that the same company keeps coming out with newer models every damn year? Do you feel any great compulsion to get a new car every time the new model year appears? Microsoft knows perfectly well they aren't going to sell every new release to every potential user. Keep in mind, the main reason WinME was released was at the behest of PC vendors more than anybody else. They made that clear at the WinHEC where the betas were distributed to attendees. From a fresh install, the sheer volume of material to pull down from Windows Update to bring a brand new machine up to snuff was immense. There was also the issue of providing updated drivers to match what PC vendors were starting to ship. The schedule on the consumer version of Win2K slipped so badly it was killed, and WinME was the consolation/booby prize.

>> I think Microsoft is a dinosaur, built on an unsustainable revenue model, and I think what we're seeing with XP is that dinosaur thrashing in its death throes. Microsoft needs most people to be willing to pay much more money much more often for their products, and I don't see that happening. Some will, of course, but fewer and fewer as the years pass. And as revenues decline, stock prices are going to follow, which kind of makes you wonder how all those folks working 80 hour weeks for little pay are going to react when they find out that their stock options aren't worth what they expected.
Microsoft is obviously trying to shift business models, but I think you're seriously mistaken about what they have in mind. Keep in mind that they've told anyone who would listen for years that the stock price was grossly overinflated. That plays havoc with the whole basis of employees slogging away in the quest to become vested. Reports have indicated that many more new hires are requiring a straightforward salary arrangement. (I'm already well into a discussion of what I think Microsoft intends and what I'd like to see in another post I started a while back, so I'll have to refer everybody there for my opinion on that.) The market is changing, but I do not believe it will ever go over to Linux as the desktop standard. Time and again the people who make such pronouncements forget the vast gulf that separates them from the people who make up 95% of the market for this stuff. If you've never worked answering a customer service line, let me assure you, the cupholder and 'any' key people are real. They thankfully don't make up the bulk of the populace, but they are more numerous than those of us who can get comfy with a Unix CLI. One of the things that drove some of the huge PC sales in recent years (in addition to the Internet hype) was that the average user could finally do everything he was ever likely to need without ever seeing DOS. Consider how truly simplistic DOS is compared to a typical *nix CLI and you start to see the problem. But hey, we have GUIs for Linux now! Doesn't that fix everything? No, it doesn't. To a novice, a GUI designed by and for the techie crowd can still possess a daunting learning curve. Despite the valiant efforts of a small set of contributors, the designs are dominated by people who not only do not care about those who aren't inclined to become technically grounded but regard them with open contempt. This makes for a psychological handicap that is evident throughout the Linux world.
I recently went to several of the biggest-name distributions' sites to download their latest. I had put aside a testbed system and wanted to approach the undertaking as a typical Windows user looking at Linux for the first time. By the end of the day I found myself wondering if Wag-Ed had not bribed the people responsible for these sites to make people with no Unix background feel decidedly unwelcome and better off under Windows. And this was just to download a CD-R image. One friggin', albeit huge, file. Despite what people may think of their distribution, the friendliest by far was, unsurprisingly, Red Hat. Then the trick was installation. It would actually have been a reasonable experience if there hadn't been an utter showstopper almost immediately. The idiotic partitioning tool couldn't comprehend that I wanted it to use the entire disk. I wanted it to simply blow away any existing partition and put its preferred default on the system. It simply would not do it. I had to exit the installer, boot a Win98 floppy, and remove the single FAT32 partition on the disk. Only when presented with a seemingly blank drive could the Red Hat installation proceed. This little hitch would have been the end of the line for a big chunk of the market. At best, a user with no understanding of drive partitions would have ended up with a Linux system that only gave the use of a fraction of the disk for native operations. I find things like this all over Linux distributions. I still, amazingly, see arguments where one person laments the range of things an ordinary desktop user is required to invoke a CLI to control, and another person responds by saying he doesn't understand why this should be a problem. These issues don't stand out to the people doing the design or to most of the existing Linux audience because these technical concepts are so deeply embedded in their minds.
For them (and I admit it hits me too when I'm working with novices) not understanding these things is akin to meeting someone who cannot operate their own legs but has nothing physically wrong. I was greatly amused by the message Jerry recently received from a Linux adherent who insisted that Jerry's take on the OS was completely wrong but offered nothing other than his insistence in support of the statement. My experience puts me in agreement with Jerry. Given the choice, Linux types are always going to favor performance over mainstream accessibility. This isn't necessarily a mortal sin. Producing a product by and for a certain class of people is fine if that is your market, but it is unreasonable to expect that to translate into the mainstream. It would be a crippling constraint for some applications to have the training wheels welded on. There is a big difference between the needs of someone editing vacation videos and someone preparing a film for theatrical release. The latter needs much more powerful tools that will inevitably demand more effort to use to their fullest. The professional editor understands this and doesn't resent it. It's part of why he can make a good living at this task. If this same person went into software development and believed that the consumer video editing package on the shelf at Best Buy should have the same learning curve, he would soon find himself losing out to competitors with a better understanding of the target market. Their products may not be as technically good, but that is less of a penalty than coming up short in usability. With this in mind, it's easy to understand why the portions of Mac OS X Apple is most protective of are those elements that turn BSD into something traditional Mac users can enjoy and use. The difference between Apple and the people working on Linux GUIs is that Apple has no confusion about their mission. It's not about writing code to make the world a better place.
It's about getting PAID to write that code, and to generate continuing revenue from it. There is no reason you cannot be both benevolent and a capitalist if your motives are clear. Which brings up another point. Will Microsoft stock decline? Sure, as will the stocks of just about every company in the industry. Those that survive, that is. The Linux companies, however, have already seen their stocks decline into the organic waste receptacle, especially those with the most interest in desktop Linux. Although it may break Stallman's little heart to admit it, some of the most critical coding work on bringing Linux into the mainstream has been bought and paid for by these companies, often thanks to capital made available through outrageously overgrown IPOs. That capital has now dried up, and layoffs are endemic in the tech industry. Many of those coders with families to support are going to be looking elsewhere for a living. Just about all of the places looking to hire such people want them working on server-side apps. Microsoft, like many others, is in for a decline as the industry shrinks. That same decline is going to slow down a great deal of Linux development as well. Guess who can survive this trend much longer than the already shrinking vendors of desktop Linux distributions and apps? Eric Pobirs

And more from Eric, on Windows XP; again very long and worth your time:

This is a confrontation that has been a long time in coming. It goes to the heart of what separates the PC from most other systems, where the hardware and operating system come out of the same company and building your own isn't an option. The same conditions that doomed the likes of the Amiga and the Ataris, and relegated Apple to a niche position, also negated the entire issue of OS piracy. People might copy an upgrade without authorization, but properly those aren't treated as direct profit drivers but rather as enhancements to the attractiveness of the platform.
Although the story of Bill Gates's annoyance over the copying of Altair BASIC is widely told, and Microsoft has for years put great effort into fighting industrial-level piracy (where the copies are passed off as genuine MS products and sold for profit), they've always been pretty easygoing about abuses at the end-user level. It accounted for a meager fraction of the market compared to new PC sales. So we never had anything more onerous than a 25-character string to provide in order to get the OS up on as many machines as we might choose. The CDs, and the floppies before them, have always been easily copyable, and there were never any hardware dongles. (Being forced to buy the hardware solely from Apple arguably makes current Macs a rather large dongle for Mac OS.) I believe that a great change is on the horizon that has Redmond extremely worried. No, it isn't Linux or Robert Reich's wacky suggestion that the Feds turn Windows into a Social Security benefit. (I'm not kidding, he really is pushing this.) The real threat to Microsoft is a lack of reasons to buy new PCs. For years it has been a commonly held belief (although an utterly false one) that every new release of Windows was so resource-hungry as to necessitate major hardware upgrades or the outright purchase of a new system. True, the base RAM and hard drive space consumed have gone up, but the cost and performance of these resources have improved so quickly as to make such complaints meaningless. CPU speed has also completely outstripped any increase in cycle-time demands. Processors have advanced so rapidly that the realistic minimum system for Windows 2000 had not been available for retail purchase for nearly three years when the OS was released. The fact is, Windows has always run pretty well on a cheap PC. The market for used computers is such that the machines turning up at charity thrift stores are capable of the most common tasks. Things have even moved backwards with the latest generation.
Despite its additional features, Windows XP uses fewer runtime resources than Windows 2000. Thus we have tens of millions of machines, at the very least, that need no major modifications to run the latest and greatest. Additional RAM might be recommended, but with 256-megabyte DIMMs falling under $50, this is truly trivial. Hobbyists are now building systems with a gigabyte or more of RAM, not because they have an application that benefits from it but just from the giddy excitement of owning such a beast. The Ferrari for grocery shopping is now a common reality in the PC world. So, what to do with this embarrassment of riches? Machines that handle the great majority of practical applications with Good Enough (TM) performance are sufficiently behind the technology curve to no longer appear in the retail channel. For the last 20+ years it seems like this industry has been striving toward a goal with no idea what to do when it arrived. What are the mainstream applications that demand still more power? It isn't games. The best-selling titles in recent times have not been terribly demanding on the hardware front. Also, the PC game industry is minuscule in comparison to the console business. Newer consoles with hard drives, broadband connections, and support for greater-than-NTSC resolutions are closing the gap that previously assured that certain game genres could only prosper on the PC. Nor are the niche markets that previously required workstations from SGI, Sun, and many fading or forgotten brands going to provide the sales that allowed full-featured Pentium 4 systems to enter the market at under $2500, compared to the $5000 demanded by a barely usable 386 machine when those first appeared. The market for expensive workstations will happily reappear if that is what it takes to keep those users supplied. Professionals can afford the investment if the power is truly there to make a difference.
The price delta above mainstream PCs will never be as heart-stopping, though; it has now become a relatively simple matter to gang together many inexpensive machines to perform the most computationally intensive portion of these tasks. The NewTek Screamer never really materialized, but now places like Digital Domain and Boeing can achieve the same effect for cheap by ganging together unused workstations after the secretaries have gone home for the day. The only machine that needs to be more expensive than the rest is the station with a high-end video card for real-time display of the job in progress, if the task in question has a complex visual aspect. In the early Eighties, one of the main things that separated business systems from 'home' computers was bitmapped displays. Business applications didn't need such frivolous items, until Xerox tried and Apple succeeded in demonstrating that a bitmapped display could play a vital role in making users more productive. Among other benefits, a GUI made the multitasking OS much more accessible to the casual user. These displays and interface designs carried a major increase in consumption of memory and CPU cycles. This came just in time to prevent a serious industry slump, as the then-major increases in bang for the buck became overkill for the applications of the day. (When was the last time anyone gave much space to discussing spreadsheet recalculation performance? Personally, I cannot recall anyone in the last five or so years mentioning a need to turn off auto-recalculation for even the biggest Excel projects.) The GUI revolution drove the market for PC power growth for a few years. Audio has also come to be recognized as worthy of the business environment, and it's nearly impossible to buy a motherboard chipset today that doesn't include 3D rendering hardware that would have been the envy of any CAD workstation in the Eighties. But what is the Next Big Thing that will drive up the minimum for Good Enough (TM) computing?
Speech recognition? It has gotten very good, but most people simply aren't interested. (It's even now an element in video games, but Sega's 'Seaman' and Nintendo's 'Hey You, Pikachu!', both virtual pet simulators, have not been successes.) Corel has bundled speech input functionality into their office suite for several years, yet most microphones bundled with PCs are never taken out of the plastic bag. Despite decades of its depiction in film, television, and literature, very few able-bodied people feel much compulsion to talk to their computers. Like many other innovations, the place for speech recognition is not sitting at a desk. It may catch on when household sensors become pervasive, but without even factoring in the current privacy paranoia, this isn't going to be in many homes this decade. Immersive environments, aka virtual reality? Again, great in media depictions but not terribly compelling in real life. Except for gaming, most people have little desire to be 'inside' their computer. Even if they did, there is still the lack of adequate display technologies at consumer prices. Again, this type of product isn't likely to show up soon enough to avoid a devastating slump in PC sales. Using 3D for the user interface in non-entertainment applications has been widely demonstrated but has not seen any real deployment to the mainstream. VR cheerleaders continue to claim it would be a wonderful thing to have an online shopping mall depicted as a 3D world. To me this just seems to be Quake without weapons. If the virtual mall contains avatars as annoying as their real-world counterparts but I can't shoot them, what is the advantage? A real mall at least gives me the chance for some vitally needed exercise. There will always be some whining voices from the perpetual thin-client contingent saying that the fabled Internet Appliance will pick up where the PC left off. These people are pursuing a fantasy market.
Every single unit in this category ever sold to date (including the Sega Dreamcast, just to pad the numbers) doesn't measure up to 10% of the worst year's PC sales since the mid-Eighties. There just isn't that big a population that wants limited Internet access but won't bother to learn the minimum needed to use a PC. Additionally, those willing to learn but driven by cost can do better with a used PC. Pentium 1 systems are so undervalued now that they are often shipped to the third world rather than put to use in the poorer parts of the US. So, barring the appearance of some hot new product that demands a 2 GHz P4 or the latest Athlon, the PC industry is going to shrink considerably in the next few years. Under that condition, the guy who builds himself three or four new machines every year, and at least as many for friends and relatives, all with an OS installed from the same Windows CD, becomes a much larger portion of the market. Making sure this newly enlarged market segment is paying for their product usage becomes a pressing issue. This will be a more difficult challenge for Microsoft than the antitrust threat. The Linux contingent will say they have the solution: don't pay for your OS. This is hardly a solution when they've yet to demonstrate their economic viability. (Red Hat has already admitted their single profitable quarter is not likely to be followed by another for a while.) Further, they've yet to produce something viable in terms of mainstream usability. So far, the only companies who can claim to be making serious money on Linux are those that charge serious amounts for their applications (which aren't GPLed) and/or in connection with high-margin server hardware. This has little bearing on the PC side of things. A look at available Linux applications shows that the users best served are those with the most in common with the developers.
Huge chunks of the products that comprise this industry serve the needs of the kinds of users who will never write a single line of code. You have to serve those who will pay for the product. Corel discovered very painfully that producing a very popular freebie is not a functional business model.

So how does Microsoft get people to pay for every installation without creating huge dissatisfaction? Some of it is unavoidable, as a certain percentage of these people will never get over their buy once, install many habit. Much lower prices may be the first thing to consider. The price for Windows XP Home (single CPU support and other deficiencies: http://www.microsoft.com/windowsxp/guide/comparison.asp ) is expected to be $99. At what price would the people who do their own installations become (most of them) honest and not need any annoying activation schemes? Should the Windows CD be freely copiable and the user only charged at activation time? If people hacking past the activation scheme doesn't become an overwhelming handicap, this could reduce the cost of distribution considerably and go a long way toward making the lower price point happen. This approach would also be much more compatible with subscriptions.

What you get for that subscription is the big selling point. The on/off switch approach is not going to fly. Once something is paid for it has to keep working forever without additional charges, except perhaps for functions dependent on external services. (Since you would need a functional relation to those services, it wouldn't matter much whether they let you download the code or not.) Likewise certain additional elements must be freely provided: bug fixes, security patches, and updated drivers, unless the update is purely in the form of new features and not corrections to existing code. The subscription should provide those things that sweeten the system as time goes on. 
Let users start off with a minimal OS for very cheap and (if they don't prefer a third party alternative) add things like MP3 Pro encoding with a higher priced subscription. If a newer version with more features, new codecs, etc. is released during the period of the subscription, it's freely available for download. It might even be worth having the bare core OS be a freebie, to act as heroinware and really confuse the Feds.

This could mark a big change in how the OS and applications are developed. Currently commercial products suffer from the death march mode of needing to achieve a given feature set by a fixed date. As high speed connections become available to a wider portion of the market it will be more viable to bring some of the piecemeal upgrade process already familiar in many Linux products to the mainstream. If some clever coder comes up with a major improvement to the grammar checker in Word, it can (after reasonable testing) become available to Word subscribers immediately instead of waiting for the next generation of Office release. This has a lot of pluses for developers and users alike, but there are serious problems too. QA testing for PC products is already nightmarish. (I spent three years doing it for games on many platforms in the Eighties, when it was comparatively simple.) How much worse does it become when you have many generations of components within a single application? A vexing problem, but not insurmountable, I think.

If this is the subscription structure Microsoft offers I'd be perfectly happy to use it. I believe it would provide a better experience for developers and users alike. The actual monetary amounts are still up for debate, but this system greatly reduces the cost of distribution, so one would hope they would focus on volume over margins, especially in the consumer sector. There would be great additional profits to be enjoyed from opening up the subscription infrastructure for use by third party publishers. 
Giving MS a few pennies on each subscription would be much easier for a small company than creating their own system. Eric Pobirs

I will have more on this in the column. Obviously an important subject. Thank you, Eric. And finally, Eric on the Code Red attacks:

Once again a coder had an ample opportunity to do some truly serious damage and only resorted to defectively implemented mischief. This adds to my thought that it was a remarkable coincidence that the company that first reported the Index Server hole was also the first to spot the worm exploiting it. The second worm to use the hole also does no damage of its own, but it leaves the server open to anybody to do most anything. The guy picking your lock isn't stealing anything, but he's leaving the door open in case anybody else has an interest.

Although it's less damaging, the first worm is more worrisome than the second in terms of future legislation. I have little sympathy for someone who after more than a month failed to install the patch and then had his server vandalized. If in the course of a month you couldn't find fifteen minutes to download, install, and reboot the server, then perhaps you need to be in another line of work. What makes the first worm more worrisome is the growing issue of liability. At what point does the net become regarded as such a vital service that it gets the same kinds of protection enjoyed by other utilities? This gets scary because the FCC cannot realistically enforce any compliance with a required patch or update unless they know where the servers are and who is responsible for them. Are we going to see a web server version of the ham radio license?

And what of virus attacks on clients? One very important aspect of Sircam is its lack of dependence on Microsoft software. It wouldn't be terribly difficult to examine the top ten mail clients for any OS and see how they store their address book. This, and the means to detect which email app is installed on the system, could be added to a virus. 
The only limiting factor would be the availability of other systems using that OS and the gullibility of its users. The same kind of Linux user who would open up an attachment of unverified trustworthiness is also the same type who makes themselves vulnerable by logging into their system as admin. If Linux becomes more mainstream this will only become more common.

I've been finding with a lot of clients that it can be very difficult to quickly convey why it's better NOT to have total control of their PC most of the time. Some of them immediately appreciate the added safety and understand that it won't change anything for the great majority of the time they use the machine once they have all their apps in place. (The ridiculous permissions required by Corel Office 2000 are not helping, though.) Some of the others take it as a direct insult, with the implication that they need to be protected from themselves. The idea of proper account management, and only logging in with administrative powers when it is purely necessary, is going to be very slow to spread in the mainstream market. User education will continue to be the chief hazard to security, well ahead of any code deficiency. To use a line from the automotive world, the most dangerous part of a car is the nut behind the wheel. In the PC world PEBCAK will continue to be the biggest source of problems. Eric Pobirs

Which is quite enough to think about on my birthday...

Dear Dr Pournelle, Thanks for posting Mr Pobirs's eloquent notes. You will get a large amount of mail on this - I'll be brief. The one area where I had a problem was his assertion that the Red Hat Linux install wouldn't take over a hard drive. There are four classes of installation: workstation, server, custom, and update. The RH server class install blows away a hard drive. I discovered this the hard way. At no time will you enter a disk partitioning scheme, because it assumes 'default' and simply proceeds to wipe everything. 
This is mentioned prominently during the installation. I simply failed to read it, thinking I would get a choice later. Regards, TC -- Terry Cole admin@maths.otago.ac.nz System Administrator, Dept. of Maths and Stats, Otago University, P.O. Box 56, Dunedin, NEW ZEALAND tel: 64-3-4797739 fax: 64-3-4798427
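Eric Pobirs's point above about keeping administrative logins for administrative work can be sketched in a few lines of shell. This is an illustrative scenario, not any vendor's tool; the uid values are passed in explicitly so the behavior is visible without actually switching accounts.

```shell
# Sketch: a routine script that refuses to do everyday work with root
# powers, in the spirit of logging in as admin only when necessary.
check_uid() {
  if [ "$1" -eq 0 ]; then
    echo "WARNING: routine work should not run as root"
  else
    echo "running as unprivileged uid $1"
  fi
}
check_uid 0       # simulate an administrator login
check_uid 1000    # simulate an ordinary user account
```

A real script would call check_uid "$(id -u)" at the top and exit when the warning fires.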
This week: | Wednesday, August 8, 2001
Eric hits the nail on the head!! Again! And I agree wholeheartedly with his take on Linux/Unix for the average user (those Aunt Minnies out there that you always refer to). And I know that you will get tons of flak from those die-hards. Just ignore it. Love your column and your site (again, just ignore those who say your site needs more "glitz", they need a life). Keep up the good work. Sorry to hear about Poul Anderson. He was one of my favorite authors. Especially way back when, when I started reading sci-fi instead of westerns. A fan. Roger D. Shorney

We will all miss Poul. Thanks.

You and Eric Pobirs both said that Linux coders will always choose performance over ease of use. I must disagree. Sweeping blanket statements like that one almost always have exceptions. The group of people developing free software these days is so big and diverse that you cannot usefully predict anything they will "always" do. In particular, I'd like to bring to your attention the research being done to improve the GNOME desktop environment. Sun Microsystems had a test group of non-nerds try out a computer running GNOME and record what they thought. As one might expect, they found some parts of GNOME to be very confusing and strange. Now the GNOME developers are planning to use the results of the test to change GNOME and make it easier for non-nerds. This is good. http://developer.gnome.org/projects/gup/ut1_report/report_main.html

There are several points about this I would like to bring to your attention. First of all, Sun is paying for the research, but the results will benefit any GNOME user (whether GNOME is running on Solaris, HP-UX, Linux, BSD, or whatever). Why is Sun doing this? Enlightened self-interest: when GNOME is Good Enough, Sun can ship GNOME on their workstations instead of CDE. CDE has a licensing cost that Sun must pay, while GNOME is free; by spending money now to improve GNOME, Sun will be able to lower their costs, and everyone else gets a free ride. 
Free software works that way: someone improves something, and everyone benefits. If there is a piece of software that I like to use, and it has an annoying flaw, I can go in there and fix the flaw; this has nothing to do with performance and everything to do with ease of use. Right now people are working to make the kernel more efficient; but other people are working to make the fonts look better, and other people are working to make setup easier, and other people are working on easy-to-use office applications, and so on.

Second of all, it is GNOME that Sun was testing, and not KDE. KDE is built on top of a library called Qt, owned by a company called Trolltech. Qt ships under two licenses, depending on what you want to do with it. If you only want to write GPL software under it, you can do so for free, but if you want to write proprietary software under it, you must pay money to Trolltech. Sun isn't interested in paying money to include KDE in Solaris; GNOME for free is a better deal for them. Some people (such as Nicholas Petreley of InfoWorld) think this is a shame; I think it is rational self-interest, and to be expected. Even if you believe KDE is better than GNOME, Hastings's Law applies: Cheaper and adequate tends to win over more expensive but better.

Third and finally, none of the problems found by the report were fundamental architectural bugs. The basic plumbing GNOME is built upon is sound; what needs work now is to clean things up and put polish on everything. This can happen quickly. GNOME has been playing catch-up, but it is now to the point where it can be quickly made to be as good an environment as any version of Windows. In fact, because it is so very customizable, you can make it look almost identical to any particular version of Windows, if you so desire. (Or almost identical to a version of MacOS, which infuriates Apple; if you want GUI buttons that look like gumdrops, they want you to buy their hardware to get them. Sorry, I digress.) 
Eric Pobirs also wrote about the difficulty of making money in the Linux market. It *is* possible to make money from a free software product... but it is more difficult, and there is less money to be made. If and when free software takes over a majority of the market, there will be less money being spent on software, which is bad for software companies but good for users.

Actually, even knowing nothing about free software, you might have been able to foresee this. If you look at the history of software, and attach price tags to various features, the price tag always falls over time. Once upon a time WordStar and SpellStar were separate pieces of software, sold separately; before long all word processors were expected to include spell checkers. The same thing happened with mail merge. Not that long ago, a TCP/IP stack was something for which you paid real money; now each OS is expected to include TCP/IP. I'm sure you can think of many examples too.

The basic features that people need and want on their desktop, both OS and applications, are well understood at this point. The target may be moving but it isn't moving fast. Free software will be there soon. If you are dedicated, you can already run a pure free software desktop; you just need to learn to avoid the land mines and sharp edges. In the near future the land mines will be gone and the sharp edges filed off.

But also in the near future, the inertia of the Windows desktop assures Microsoft steady sales of their software. It will be a royal pain to convert thousands of perfectly good Word documents into probably-slightly-wrong AbiWord documents; so why bother? As long as Microsoft doesn't charge too much, or screw up the software too badly, people will just keep paying. In business, a few hundred dollars every few years isn't so bad. But Microsoft may yet screw up the software badly enough to make switching attractive. 
Product Activation is bad; for example: http://www.zdnet.com/anchordesk/stories/story/0,10738,2779746,00.html If Microsoft ever tries to force their users into renting instead of owning, that would be a major mistake. If the customer has to pay money every month or Office gets turned off, completely free software will start to look very attractive. Sorry this is so long. I could make some cuts to it if you like. -- Steve R. Hastings "Vita est" steve@hastings.org http://www.blarg.net/~steveha

I refuse to get into big discussions on what "might be": Microsoft is aware of what I think on Activation because I told the product managers in face to face discussions; what I don't know is what they will actually do as opposed to all the speculations. I find it very hard to believe that Microsoft has any notion that users will pay a monthly fee. I never heard any Microsoft person express that view: it seems to be like the opinions many have about me, one of those opinions imputed by an opponent that achieves the status of legal fiction that cannot be questioned. Perhaps that is Microsoft's goal, but I doubt it. As to Linux developing, my views are not the same as Eric's or as yours, but I tend to think that Linux attracts the kinds of people who will favor performance over usability, and I see no evidence to the contrary in most Linux discussions, even among those who think they have a different view.
Jerry: I just heard about a new kind of virus that seems to spread using Adobe Acrobat. See: http://news.cnet.com/news/0-1003-200-6808673.html?tag=mn_hd Perhaps your prediction (on your web site of Monday, Aug. 6) of trouble for Adobe because of their prosecution of hackers is coming true! David C. Barnes 1115 NE Orchard Dr. Pullman, WA 99163 GO W.S.U. Cougs......... Something that is not worth doing is not worth doing well. (J. Pournelle)

News.com reports that the first known PDF-carried virus (PDF being Adobe's enhanced text format) has been seen this week. Only those with PDF authoring tools are affected; the viewer is said to be safe. http://news.cnet.com/news/0-1003-200-6808673.html?tag=mn_hd Coincidence? I suspect not... - Mike Earl
Fascinating, isn't it? And I doubt Adobe has paid the real price yet...

Jerry, Chris Smith wrote "Does the average Windows 2000 (non-corporate) user log in with admin privileges? Most definitely. Is this a fault of the OS? No. It's easy to configure. It's easy to secure. It's relatively easy to protect...A competent Windows 2000 administrator can make the average W2K install as secure as the average Unix installation. There is no substitute for competent administration. The home user does not want to perform administrative tasks." ["Mail" 28 July 2001, at http://www.jerrypournelle.com/mail/mail163.html ]

Unfortunately, at least one software maker is making it necessary for users to log in with administrative privileges in order to play their game. I recently upgraded a friend's PC to Windows 2000 Professional (fresh installation), because I was tired of the instability of Windows 98. I set him up as a Power User, so he could install software without having to call me. This setup worked fine for over a month, until he got Diablo II a few days ago. Local Administrator access is REQUIRED to play the game on Windows 2000 (but apparently, not with NT 4.0, see http://www.blizzard.com/diablo2/). I spent over an hour searching the internet for a solution -- most of which involved registry hacks -- but nothing actually worked. So now, my friend's only options are to log in as Administrator to play the game, or make his user account a member of Local Administrators. (It should also be noted that he is using AT&T's cable modem service, so his PC is usually on the network. I'm trying to convince him of the need for a router/firewall right now.) 
As long as game makers like Blizzard are going to encourage the casual use of administrator access -- rather than having it be reserved for administrative functions on the rare occasions it's needed -- they are making it much harder for people like us to persuade others to practice even mild security measures, and create more secure computers for our non-technical friends. It almost makes me want to get out the manacles, parachutes, airplane, and rhinoceroses for the software developers at Blizzard.... (see "Mail" 01 August 2001 at http://www.jerrypournelle.com/mail/mail164.html ) Robert Racansky

Oh my o my...

I submit this to the collective wisdom here: I was wondering if you or your readers could help me out with where to find some info: I want to build a small 3-phase AC generator and electric motor. (How small? Small enough to be easily hand-cranked; something that won't kill any young students who simply cannot resist touching it to see if there really is electricity being produced in those wires.) I need plans, or a book that will tell me where to find the info to draw up my own plans. Suggestions on suppliers would be greatly welcomed as well. Sincerely, Jim Snover

I suspect we will know soon enough. And indeed:

Hi Jerry, In answer to your query "Does anyone know of a good source of information for people like the young lady mentioned above?" regarding easy to understand security information, try this link from CERT: http://www.cert.org/tech_tips/home_networks.html "This document gives home users an overview of the security risks and countermeasures associated with Internet connectivity, especially in the context of "always-on" or broadband access services (such as cable modems and DSL). However, much of the content is also relevant to traditional dial-up users (users who connect to the Internet using a modem)." Recommendation number 7 on that page is "Keep all applications, including your operating system, patched". Guy Kelsey Saskatoon, Canada
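For what it's worth, the usual workaround for programs like the Diablo II case Robert Racansky describes is not to hand out admin rights but to loosen permissions on just the directory the program insists on writing to. The sketch below shows the idea with an invented Unix path; on Windows 2000 the analogue would be adjusting NTFS permissions on the game's folder, and it only helps when the program's real need is file access rather than something deeper.

```shell
# Sketch: grant write access to the one directory a program needs,
# instead of making the user an administrator. Paths are invented.
mkdir -p /tmp/demo-game/save
chmod u+rwx /tmp/demo-game/save
# The "game" can now write its save files as an ordinary user:
touch /tmp/demo-game/save/slot1.sav && echo "save written without admin rights"
```

When the program also writes to the registry or system directories, this trick alone won't be enough, which is evidently why the registry hacks Mr. Racansky found didn't work.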
This week: | Thursday, August 9, 2001
The day was devoured by locusts.
This week: | Friday, August 10, 2001
I will at some point collect the relevant parts of this discussion into one place.

Dear Dr Pournelle, Happy BIRTHDAY - I hope this finds you and yours well. I enjoy your site and your columns immensely. Next time you see Larry Aldridge say hi for me. I once worked for him here in Minnesota (at Lucht Engineering). Sorry to hear about Poul Anderson too. I really like his stories (yours too) and will miss his contribution to the Sci-Fi world. Sigh.

A quick comment as regards Mr. Pobirs' comments: On the difficulty of Linux installs - Yes, the average user will have trouble with them. I submit that the casual PC user would also have difficulty doing an MS install (based on the number of hours spent fixing PCs and questions I get from PC users/owners in my hockey association) on many hardware platforms. For example, my Dad was unable to get his Xerox printer (M750) to work on the default Win2k install. Why? Because Xerox (after a year of 2k availability) didn't ship a driver for 2k. Stupid vendor - yes, but still an install problem. Dad would have had to download 3MB of stuff (actually I did, and cut a CD) and install to get this going. Ditto the MB vendors etc. The average home user has NO clue what to do if their computer should require a Windows reload. Most can't even tell you the vendor of the hardware, and the MS stuff doesn't (shouldn't either) have the drivers on the install CD. One should note that MS spends a TREMENDOUS amount of time and money perfecting installation software and process. Yet ALL the PC vendors then spend even more dollars and man-hours getting the "pre-loads" sorted out. If the same amount of time and money were spent on Linux machines, the installers and packages would perform as well as or better than the MS ones.

As regards virii and worms - MS software is the target of all this because it's: 1- easy to craft, release and propagate MS based virii and worms. 
2- the biggest computer OS target population on the planet.

We should all be thankful that these virii/worm writers haven't melded some really toxic payload (say a variant that wipes the HD and Flash after 10 days of silent infection) into all the latest round of hits. As regards users keeping track of MS security patches, I direct you to Russ's comments on NTbugtraq as to how HARD it is for a professional admin to try and keep track of the required patch set / hot fixes. MS has NOT done a very good job on that front. Sincerely, Paul R. Cole bdssprc@wavetech.net

Dear Dr. Pournelle, You said, "I tend to think that Linux attracts the kinds of people who will favor performance over usability." I don't think it's a case of "performance or usability", as I find Linux far more usable than Windows for the kind of work I do. I do a lot of data processing/analysis and communications work. The Unix toolset and philosophy are ideally suited for this kind of work. I frequently am able to extract and summarize data from large flat files while talking to the client. More complex tasks are readily implemented as well. The command-line oriented toolset is dead easy to script, and scripts are dead easy to chain together into more complex tasks. Unix pipes provide cheap and efficient multithreading with ease. The Windows toolset (Office, et al) is another story entirely. The heavy dependence on the GUI drag-n-drop methodology can really slow things down. I honestly can't stand to use it for any real work. It's like driving down a back-country road -- stuck behind a farm tractor! So, I think that the concept "usability" really depends heavily on the user and the task. For what I do, Linux gives me maximum performance *and* usability. For Aunt Minnie, maybe something else. Best, Gordon Runkle -- It doesn't get any easier, you just go faster. -- Greg LeMond

Jerry, I am waiting to read where you, Eric P and RBT go in your discussion of where the personal computer market is going. 
I own a WINTEL network in the office at my small business, and we upgrade as new applications or new customer requirements require. Just now I run an Intel PIII 800 MHz with the Intel 815 EEAL motherboard that you and RBT recommended. The OS is Windows 2000 Professional, which I have found to be as effective and efficient as my Mac OSs. We have some customer applications that require DOS as well. The one area I have run into problems on upgrades is in interfaces. When I upgraded my wife's iMac to a newer, faster machine with wireless networking capability it moved her to USB. Left behind were older serial connections. One was for her digital camera, so we are anticipating the arrival by UPS of a combo SmartCard and Compact Flash reader to work around the lack of a direct serial connection from the digital camera to the computer. I have other hardware that talks to my Mac G3 via the Apple Desktop Bus, and yet another that uses the printer port serial connection. I haven't talked to the manufacturers yet to see if they have either a FireWire or USB solution. jim dodd San Diego

Jerry: Here is my dilemma: recently I had to re-install Win2K. No problem; install off CD, then download and install Service Pack 2, and finally install any updates found at windowsupdate.microsoft.com. My question is this: is there an easy way to find any updates Microsoft has released that haven't found their way to the windowsupdate site? The reason I ask is that Microsoft takes anywhere from 2-3 months to add any Windows updates to their website. I have several ways of finding security releases as they are released, since half the web sites I visit mention new releases as they come out. I just have no way of keeping tabs on what was released 1 month ago. It seems that I am always operating with a 2 month gap in security releases. 
I suppose the ultimate blame is mine for not properly cataloging any updates I download post-SP2, but I had counted on windowsupdate to be at least relatively current, especially with the current trend these days for hackers to find and exploit any security holes found in Windows. rock on dimitrios stathopoulos

I have not found appreciable lag in the updates page. And there is a reply.

Dear Dr. Pournelle, Thank you for everything. I have been a huge fan of yours since you were writing your adventures with Zeke, Zeke II, Adelle, Adeline, and Lucy van Pelt :) long ago, in my home country. Now I write you first because I want to thank you for the stand you've chosen to take re: the Sklyarov case. I hope that finally the corporations and the U.S. government may regain their sanity. But second, I would like to comment on your casual remark that Linux users tend in general to favor performance over usability. That was in connection with some remarks -rather daring and not very guarded, IMHO- by Eric Pobirs, and a reaction to them. You are right. Linux users, as a whole and in general, tend to favor performance over usability. But let me qualify your statement a little bit. The proposition is entirely true if we identify 'simplicity' and minimalism with 'performance'. The existence of distributions such as Slackware is proof of it. Linux users in general abhor bloatware and useless bells and whistles. However, the Linux community has also made great advances in the area of usability. A reader's comment about the usability study of the GNOME desktop environment, by Sun, is on track. And there is also the K Desktop Environment. Dr. Pournelle, have you ever tried to use KDE 2.1.x? I was a "mixed bag" user, using Linux for learning and hacking, and Windows for the everyday tasks, until I got KDE 2.1 some months ago. That did the trick and now I am a full Linux user. 
KDE is such a great, powerful and friendly environment that perhaps it can work that same trick for many more users like me. I would encourage you -and would urge Eric Pobirs- to try it and 'feel it'. There is another trait in the 'usability' field that is very strong on Linux and very weak in other OSes: respect for the user. Right now I am using Kmail as my email client. It is fully graphical, has a nice set of rules, and whatever. But also *it allows me to disable by default the HTML rendering* of all messages. In this way, I can be very confident that the annoyance and the security risks inherent in the HTML code are dealt with. Try to do that with Outlook, Outlook Express, Netscape Messenger, or even Mozilla Mail. If a sender issues an email in HTML format only, you're stuck with HTML rendering, with the exception of Kmail, which is a Linux mail client.

Besides that, I surf the web with Konqueror, the web browser and file manager of the KDE environment. It is small, capable, and fast. But its two most noteworthy features are related to consumer respect also. First of all is kcookiejar, which manages cookies with a database that stores permissions for all websites you surf. If a website you surf for the first time sends you a cookie, a dialog pops up asking you what to do. You can choose between allow or reject: only that cookie, all cookies from that site, or all cookies. The second important feature is a checkbox in the configuration dialog that allows you to disable window.open() in JavaScript, so all the annoying popups are gone. Without editing some arcane config file. Without downloading any bloated, third party software. That is why many users are migrating to Linux. Linux is an operating system that despite its rough corners seems to treat its users with respect. That is usability, also.

I would like to disagree with Eric Pobirs about the no-no of a Command Line Interface. You, Dr. 
Pournelle, know command lines very well, since your CP/M days and perhaps before. I do, too, and I am pretty sure you will be prepared to say that as an average computer user of Windows, sooner or later you will have to face a command line for some emergency procedure. In that case I would ask: would you rather have the full Linux system with bash and all the utilities and the online documentation, or the C:\> prompt? I agree that an average user should not be *forced* to use the command line for getting any job done, but if so, whose command line is friendlier?

There's another twist to that. Historically, the Windows GUI and kernel were put on top of the real-mode DOS, the one that remains basically unchanged since the DOS 3.30 days. So, in the Windows world, the only way to take advantage of the full power of your processor was -and is- to use the Windows GUI environment. But Linux is an OS whose command line does not have that constraint, and so there are plenty of command-line or console apps that can get the job done and are *very* friendly, even for a novice user. So, and perhaps you may agree with me on this, a command line is not and should not be taboo. Well, enough of my writing. Sorry for consuming your time and your patience with this, and thanks for your understanding. Yours, Eduardo Sánchez -- Eduardo Sánchez Th.M. student, Calvin Theological Seminary --------------------------------------------------------- "Ierusalem, Ierusalem, convertere ad Dominum Deum tuum!"

I like command lines for some things too, but as memory fades it is maddening to try to open a file to edit and get nothing because I typed in the name incorrectly. Easier to point and click. I liked COMMANDER back in DOS days for exactly that reason. Commander was GOOD ENOUGH since it had both a command line and point and shoot. Icons aren't all that interesting. Point and shoot is. 
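The flat-file extract-and-summarize work Gordon Runkle mentioned is a good concrete example of what the command line buys you. A minimal sketch, with an invented data file and field layout:

```shell
# Three fields per line: order id, product, quantity.
cat > /tmp/orders.txt <<'EOF'
1001 widget 3
1002 gadget 1
1003 widget 7
EOF
# One pipeline sums the quantities per product -- no GUI required.
awk '{qty[$2] += $3} END {for (p in qty) print p, qty[p]}' /tmp/orders.txt | sort
# prints: gadget 1
#         widget 10
```

Chain another pipe on the end (a grep, a second awk, a formatter) and you have the "scripts chained into more complex tasks" he describes, which is exactly the kind of thing that is tedious to do by drag and drop.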
Jerry, Microsoft has forced more upgrades for its software with file format changes than it has with new features. That's especially true of the Office product, but it also pertains to the OS. For example, several programs now require IE5.5, although that product differs little from IE5 in functionality while being demonstrably less stable. Even Windows controls in DLLs are changed substantially, rather than replaced with differently named components or having functions added to allow backwards compatibility. Still, the true theft committed by Microsoft is in its regular decision, with the last four versions of Office, to change the format for DOC and XLS files. Technically, this last change to XML does apply, but the concept is the same. And it's no coincidence that releasing total control of file formats is where Microsoft started its push toward the subscription models. Up until XP, it was maintaining the Office revenue stream by selling the latest version of the product to big customers, who would then send v97 files to users of v95 and then v2000 files to v97 users. The small guy usually had no choice but to upgrade too. XP has the now standard file format change, but it will be the last to be dictated by Microsoft.

And it's the file format issue that has had as much as anything to do with the reluctance to give Linux and, say, StarOffice a chance. As long as StarOffice is slightly buggy (although usually no worse than a point-oh release of an MS product) AND has problems reading the MS Office file format of the moment, there's NO chance of Linux gaining a position of trust. With XML not the moving target Office file formats WERE, StarOffice has a chance at compatibility. And with that compatibility, I think it might have a chance to gain a foothold it hasn't deserved until now. Combined with the fussers and fidgeters at home, the foothold into the business community might start to create the critical mass to get Linux really up and going. 
Don't overlook Kylix's ability to bring a popular RAD programming environment to the Linux world, as well. Until Linux is the alternative it WILL become, I'll join Bob Thompson and others as non-users of XP. I don't need anything it has to offer. At least I hope not. GM Gary Mugford Idea Mechanic Bramalea ON Canada I will reserve comments for a while.
A Roland pointer, with the heading "Utter Contempt": http://www.theregister.co.uk/content/6/20894.html -- ---------------------------- Roland Dobbins <mordant@gothik.org> Indeed. Dear Dr. Pournelle, Take a look at the picture gallery at http://www.rawa.org . The site also tells how these pictures were made, by whom, and what happens when they are caught. Any resemblance to practices of our own US government is the conclusion of the reader, arrived at independently and without my help. Regards, William L. Jones wljones@dallas.net I do have to say that I don't find that kind of malice in the United States. But power seekers are power seekers, and empires have to mind everyone's business. In Adams' time we could say "We are the friends of liberty everywhere but the guardians only of our own." Today we sacrifice our own liberty to be involved in doing good everywhere. Let's hope it's worth it.
Jerry, Perhaps a box on your site that takes would-be question writers directly to Google? Here is a Google return on the query <3 phase AC machines for students>: http://www.elmomachine.com/turbo_500.htm As an old Navy electrician, I can attest to how much fun they are to play with, but even 50 volts under the right conditions can yield 100 to 200 mA through the chest -- that will stop the heart...jim dodd The Rest of the Story: Dear Dr. Pournelle: Just thought you might like to know the rest of the story on the 18-year-old National Honor Society student who was banned from her high school graduation and faced serious felony charges for having a 5" kitchen knife in her car in the school parking lot. The State's Attorney basically said 'I can't prove she knew it was there, so I'm dropping the case.' She still didn't get to participate in her graduation ceremony, and had the interesting experience of being arrested, booked, and released on bail before common sense prevailed. Or maybe the firestorm of web activism had something to do with it. This link: http://www.freerepublic.com/forum/a3b0d689a77a3.htm leads to a message board with a lot of back and forth about this incident, including lists of the US mail, e-mail, web, etc. addresses of the principal, the sheriff, the state's attorney, and so on. For a brief recap of the story and the final outcome, this link works: http://channelonenews.com/feature/2001/06/20010605_3/ Tim Morris Never ascribe to malice that which is adequately explained by incompetence.
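Jim Dodd's 100-to-200 mA figure is plain Ohm's law, I = V / R. With wet skin, body resistance can fall to a few hundred ohms; the 500 and 250 ohm values in this sketch are illustrative figures, not measurements:

```shell
# Ohm's law check of the 50-volt claim: I = V / R, reported in milliamperes.
# Body resistances (500 and 250 ohms) are illustrative wet-skin figures.
for r in 500 250; do
  awk -v v=50 -v r="$r" \
    'BEGIN { printf "50 V across %d ohms -> %.0f mA through the chest\n", r, v/r*1000 }'
done
# 50/500 A = 100 mA and 50/250 A = 200 mA, matching the letter's range;
# currents on that order through the chest can induce ventricular fibrillation.
```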
On Ricochet:
The auction of Ricochet wireless in its current form will mean the end of an incredibly useful method of wireless access to the Internet - perhaps the only practical method that could be available within the next 5 or so years. Other methods that have been proposed are far more expensive and are mostly captive methods of the TELCOs, the companies who have put virtually all mid-tier DSL players out of business. Ricochet was independent of the phone companies. While they had no understanding of their customers (advertising 75 dollars per month to college students was senseless) and over-built their network (Ricochet modems would work for miles outside of stated coverage areas), they were finally beginning to understand what they needed to do to survive when they ran out of money. Metricom was running tests of marketing and advertising strategies in Dallas, Atlanta, and San Diego before finally shutting down. Testing a marketing campaign must be a first in Silicon Valley. Their product worked, and worked well indeed. Auctioning Ricochet at this point will kill the best national wireless broadband system devised to date. The court documents show an intent to auction the spectrum separately from the infrastructure, which will effectively kill the viability of the modems, pole-top units, and gateway. I urge you to discuss in the press the value of someone "taking over" the entire Ricochet system, in toto, so that it can continue to provide a viable alternative to what will shortly be a TELCO monopoly of all wireless broadband. Bill Kennon Not much I can do but cry... More on Linux: Jerry, Since there's been a lot of Sturm und Drang lately about Linux vs. Windows, I thought you might find my tale interesting, or at least relevant. About a month ago I began gathering parts for a new workstation to replace my 3-year-old Celeron system. I decided to give Linux a whirl on the new system, since I could keep Windows going on the old system during the transition and not risk any downtime. 
I checked around and determined that I could do everything I needed in the Linux environment, except use my Quicken and Quickbooks. But those applications are supported under Win4Lin, and I figured that at worst I could run them via a terminal session off my W2K server. I started working with PCs on a Tandy 1000SX, running TandyDOS 3.2, ran through DR-DOS, DesqView, ran OS/2 for several years, and still depend on the command line even in Windows; Linux held no fears for me. But I went into the project with a certain 'hassle budget' in mind; if getting everything working on Linux became too much trouble, off it would go, to be replaced by Windows. As someone once said, people don't buy computers to run operating systems, they buy them to get work done. After a few misadventures with online vendors (a few are very good, many mediocre, and a few are very bad, but that should not come as a surprise) all the parts were finally in one place and I got the machine built. I allocated 5 GB of the 40 available to Windows 2000, planning on a dual boot system, and let Windows install. It didn't recognize my sound, video, or network, but I had the drivers handy and was up and running pretty quickly. Then came Linux-Mandrake. The install went very smoothly, recognizing all my hardware and installing a graphical boot manager so I could dual boot between Windows and Linux. I was impressed. Then I tried to load VNC, which I use for remotely managing my own and clients' servers. The package would not load off the CD. Hmm. Tried a few other packages. Nope; corrupt package messages. Hmm. Perhaps the packages CD hadn't burned right. It took only a few minutes to FTP the ISO image over to the new machine (which had a 16x Plextor burner, very nice, and far superior to the 2x in my old machine), but then the fun began. 
I tried two different Linux GUI-based burners (which are just front ends to the command-line tool), and the command-line tool itself, and could not figure out how to get it to burn a CD from an image. I managed to waste one CD by burning the ISO image onto it as a file. Grumble. After about 45 minutes of fooling around in this manner I cut my losses, booted into Windows 2000, installed the Plextor burning software, and had my CD in about 8 minutes (counting time to install the software; that Plextor is _fast_). Back into Linux. The packages still would not install. Try downloading the VNC package and installing from the downloaded file. No joy. At that point I called it a night, and returned fresh to the attack the next morning. No joy. Downloaded the generic binaries from the VNC website. Success! Okay, I had email working, in a manner of speaking, could access the web, and could remote-control my machines. Time to move the new system out of the living room and into my office, where I could hook it up to my main monitor (all this had been done on an ancient 14" screen). My wife was thrilled to have the mess out of the living room, but now I was face to face with another problem. The fonts were horrific. Depending on the application, or the web page, they varied from ugly to completely unintelligible. Sigh. I did a bit more research and determined that this is a common enough problem that there's a Font Deuglification Mini How-To. Well. Time went by taking care of other things, and at length I got some Windows fonts imported into Linux and my display up to at least legible, if not great, quality. At this point about 4 days had passed, with me still using my old 464 MHz Celeron system as my main workstation while the nice dual PIII-850 system idled between my efforts to hammer Linux into workable condition. 
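The corrupt-package errors and the wasted blank disc above have a standard two-step remedy: verify the downloaded image against its published md5 sum before burning anything, and then write the .iso as a raw image rather than as a file on a data disc. A sketch, with the filename invented and the "published" checksum simulated locally:

```shell
# Sketch: verifying a downloaded ISO before burning (filename invented).
# A silently corrupted download is the usual cause of "corrupt package"
# errors; checking it first avoids blaming the burner or the blank media.

# Simulate the downloaded image and the mirror's published checksum file:
echo "pretend ISO9660 image contents" > disc2.iso
md5sum disc2.iso > disc2.iso.md5

# After the FTP transfer, confirm the copy is intact:
md5sum -c disc2.iso.md5 && echo "image OK, safe to burn"

# The burn itself would typically be something like
#   cdrecord -v speed=16 dev=0,0,0 disc2.iso
# (the dev= address varies; 'cdrecord -scanbus' lists drives). The classic
# mistake described above is writing the .iso as a *file* on a data disc
# instead of writing it as a raw image.
```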
I still had not approached the matter of access to my Windows network, printing, the bulk of the applications I would need, or moved my data, and the CD burning issue remained unsolved. My hassle budget had been reached. While I remained confident that, if I had to, I could do everything I needed to on a Linux system, I could get there quicker and easier via Windows. I wiped the system, loaded Windows and began installing applications. The next day I was able to start using the new system for real. I was quite impressed with the parts of Linux that worked well, but in the end the amount of work involved in getting it up to the same functionality and ease on my eyes (and I don't just mean pretty, but actual eyestrain) as Windows was just not worth the reward. People will only put up with extra hassle to get _more_ than they would have gotten without the hassle, and there'd better be a lot more. At this point, Linux just doesn't offer enough more than Windows 2000; the satisfaction of saying, "I'm running Linux" is insufficient reward for most people. --Robert Brown Apparently Linux doesn't much care for us old dinosaur killers (who grew up fighting mainframes and promoted distributed computing...)
|
This week: | Saturday,
August 11, 2001 AWARDS and social stuff all weekend
|
This week: | Sunday,
August 12, 2001 Book signing at Universal City
|