• Cast locally, stream globally

    Here’s a great idea for local TV news departments: start streaming, 24/7/365, on the Net. You don’t need to have first-rate stuff, and it doesn’t all have to be live. Loop fifteen minutes of news, weather and sports to start. Bring in local placeblog and social media volunteers. Whatever it takes: you figure it out.  Just make it constant, because that’s what TV was in the first place, and that’s what it will remain after the Internet finishes absorbing it, which will happen eventually. Now’s the time to get ahead of the curve.

    Here’s why I thought of this idea:

    Al Jazeera English. Far as I know, it’s the only serious TV that’s live, streaming 24/7/365 on the Net. I watch it on the iPad wherever we have it… in the car, on a cabinet in the bedroom, or — in this case — on the kitchen counter, next to the stove, where I was watching it while making breakfast yesterday morning. That’s when I shot the photo.

    At our place we don’t have a TV any more. Nor do a growing number of other people. Young people especially are migrating their video viewing to the Net. Meanwhile, all the national “content” producers and distributors are tied up by obligations and regulations. Try to watch NBC, CBS, ABC, TNT, BBC or any other three- or four-letter network source on a mobile device. The best you can get are short clips on apps designed not to compete with their cable channels. Most are so hamstrung by the need to stay inside paid cable distribution systems (or their own national borders) that they can’t sit at the table where Al Jazeera alone is playing the game.

    That table is a whole new marketplace — one free of all the old obligations to networks and government agencies. No worries about blackouts, must-carries and crazy copyright mazes, as long as it’s all the station’s own stuff, or easily permitted from available sources (which are many).

    Savor the irony here. Al Jazeera English is the only real, old-fashioned TV channel you can get on a pad or a smartphone here in the U.S. It’s also the best window on the most important stuff happening in the world today. And it’s not on cable, which is an increasingly sclerotic and soon-to-be marginalized entertainment wasteland. A smart local TV station can widen the opportunities that Al Jazeera is breaking open here.

    Speaking as one viewer, I would love it if , , , , or had a live round-the-clock stream of news, sports, weather and other matters of local interest. We happen to live at a moment in history — and it won’t last long — when ordinary folks like me still look to TV stations for that kind of stuff, and want to see it on a glowing rectangle. Now is the time to satisfy that interest, on rectangles other than those hooked up to antennas or set-top boxes.

    And if the TV stations don’t wake up, newspapers and radio stations have the same opportunity. Hey, already puts Dennis and Callahan on . Why not put them on the Net? And if NESN doesn’t like that (because they’re owned by Comcast), WBZ can put  on a stream. The could play here. So could and . ‘BUR already has an iPhone app. Adding video would be way cool too.

    The key is to make the stations’ video streams a go-to source for info, even if the content isn’t always live. What matters is that it leverages expectations we still have of TV, while we still have them.

    And hey, TV stations, think of this: you don’t have to interrupt programming for ads. Run them in the margins. Localize them. Partner with Foursquare, Groupon, Google or the local paper. Whatever. Have fun experimenting.

    Yesterday, the king of local TV consultants (and a good friend) put up a post titled The Tactical Use of Beachheads. Here are his central points and recommendations:

    There is, I believe, a way to drive the car and fix it at the same time, but it requires managers to step outside their comfort zone and behave more like leaders. The mission is to establish beachheads ahead of everybody else, so that when the vision materializes, they’ll be prepared to monetize it. This is a risk, of course. There’s no spreadsheet, no revenue projections to manage, no best practices, no charts and graphs, because it’s not about seeing who can outsmart, outthink or outspend the next guy; it’s all about anticipating new value and going for it. The risk, however, can be mitigated if the beachheads are based on broad trends.

    This can be very tough for certain groups, because we’re so used to being able to hedge bets with facts and processes. Here, we’re leapfrogging processes to intercept a moving target. It’s Wayne Gretzky’s brilliant tactic of “skating to where the puck is going to be,” instead of following its current position.

    In our war for future relevance, here are five beachheads we need to establish in order to drive our car and fix it at the same time. Four of them relate to content that, we hope, will be somehow monetized. The fifth deals specifically with enabling commerce via a form of advertising.

    1. Real Time Beach — It is absolutely essential that media companies understand that news and information is moving to real time, and that real time streams are what will really matter tomorrow. It’s already happening today, but until somebody makes big money with it, we’ll continue to emphasize that which we CAN make money with, the front-end design of our websites. These streams take place throughout the back end of the Web, and they will make their way to the front end, and soon. There are early signs of advertising in the stream, and we should be experimenting with this, too. This is an unmistakable trend, and if we don’t move and move fast, it’s one I’m afraid we’ll lose.
    2. Curation Beach — Examples like Topix above show that curation beach is really already here, although I’d call those types of applications “aggregators.” They’re dumb in that they’re simply mechanical aggregators of that which is — for the most part — being published by others. Curation is more the concept of helping customers make sense out of all the real time streams that are in place. We’re all using the streams of social media, for example, to “broadcast,” but the real value is to pay attention and curate. This is a beachhead ready for the taking.
    3. Events Beach — One of the key local niches still left for the taking is the organizing of all events into an application that helps people find and participate. The ultimate user application here will be portable, for it must meet the needs of people already on-the-go. I refer to this beachhead as “event-driven news,” and it is largely created and maintained by the community itself. Since many events dovetail with retail seasons, this is easily low-hanging beachhead fruit.
    4. Personal Branding Beach — If everybody is a media company then media is everybody. This is a fundamental reality within which we’re doing business today, and it presents a unique opportunity for us and our employees. The aggregation of personal brands is a winning formula for online media, and we should be exploiting it before somebody else does. Our people are our strongest asset for competing in the everybody’s-a-media-company world, and we have the advantage of a bully pulpit from which to advance their personal brands. This is more important than most people think, because the dynamic local news brands of tomorrow will be associated with the individual brands of the community. The time to begin establishing this beachhead is now.
    5. Proximity Advertising Beach — The mobile beachhead is both obvious but obscured, because we’re all waiting for somebody to show us how to do it. This could be a real problem, for we know what happened when we allowed the ad industry itself to commodify banner advertising. Outsiders set the value for our products. The same thing is likely to happen here, unless we stake out territory for ourselves downstream first. There are predictions that mobile CPMs will hold at between $15-$25, and that’s enough to make any mobile content creator smile, but I would argue that the real money hasn’t even been discovered yet, because these CPMs are merely targeted display. Remember that the Mobile Web is the same Web as the one that’s wired, and it behaves the same way. The new value for mobile is proximity, and that’s where we need to be focusing. Let’s do what we can to make money with mobile content, but let’s also establish a beachhead in the proximity marketing arena, too, because that’s where this particular puck is headed.

    If we approach these beachheads entirely with the question “where’s the money,” we’re likely to miss the boat. This strategy is to get us ahead of that and let the revenue grow into it. None of these will break the bank, and they’ll position us to move quickly regardless of which direction things move or how fast.

    Live local streaming on the Net is a huge beachhead. I see it on that kitchen iPad, which only gives me Al Jazeera when I want to know what’s going on in the world. The next best thing, in terms of moving images, is looking out the window while listening to the radio. Local TV can storm the beach here, and build a nice new business on the shore. And navigating the copyright mess is likely to be a lot easier locally over the Net than it is nationally over the air or cable. (Thank you, regulators and their captors.)

    And hey, maybe this can give Al Jazeera some real competition. Or at least some company on TV’s new dial.

    [Later…] Harl’s comment below made me dig a little, so I’m adding some of my learnings here.

    First, if you’re getting TV over the Net, you’re in a zone that phone and cable companies call “over the top,” or OTT.  ITV Dictionary defines it this way:

    Over-the-top – (OTT, Over-the-top Video, Over-the-Internet Video) – Over-the-top is a general term for a service that you utilize over a network that is not offered by that network operator. It’s often referred to as “over-the-top” because these services ride on top of the service you already get and don’t require any business or technology affiliations with your network operator. Sprint is an “over-the-top” long distance service, as they primarily offer long distance over other phone companies’ phone lines. Often there are similarities between the service your network operator offers and what the over-the-top provider offers.

    Over-the-top services could play a significant role in the proliferation of Internet television and Internet-connected TVs.

    This term has been used to (perhaps incorrectly) describe IPTV video also. See Internet (Broadband) TV.

    But all the attention within the broadcast industry so far has been on something else with a similar name: over-the-top TV (not just video) which is what you get, say, with Netflix, Hulu, plus Apple’s and Google TV set top boxes. Here’s ITV Dictionary’s definition:

    Over-the-top-TV – (OTT) – Over-The-Top Home Entertainment Media – Electronic device manufacturers are providing DVD players, video game consoles and TVs with built-in wireless connectivity. These devices piggy back on an existing wireless network, pull content from the Internet and deliver it to the TV set. Typically these devices need no additional wires, hardware or advanced knowledge on how to operate. Content suited for TV can be delivered via the Internet. These OTT applications include Facebook and YouTube. Also see Internet-connected TVs.

    No wonder TVNewsCheck reports Over-The-Top TV at Bottom of Station Plans. Stations are still thinking inside the box, even after the box has morphed into a flat screen. That is, they still think TV is about couch potato farming. The iPhone and the iPad changed that. Android-based devices will change it a lot more. Count on it.

    Since Al Jazeera English is distributed over the top by LiveStation, I checked to see what else LiveStation has. They say they have apps for CNBC, BBC World News and two other Al Jazeera channels, but on iTunes (at least here in the U.S.) only the three Al Jazeera channels are listed as LiveStation offerings. LiveStation does have its own app for computers (Linux, Mac and Windows), though; and it has a number of channels (not including CNBC) at . I just tried NASA TV there on my iPhone, and it looks good.

    Still, apps are the new dial, at least for now, so iPhone and Android apps remain the better beachhead for local stations looking for a new top, after their towers and cable TV get drowned by the Net.

  • The necessity of inventions

    Over on Facebook, Don and friends have been pondering the provenance of Invention is the mother of necessity. Writes Don, “… heard that once from Doc Searls – I think it’s an original. Seems more true every day. (think facebook, smartphones, the internet, computers).” So I responded,

    Back in the ’80s, I had a half-serious list of aphorisms I called “Searls’ Laws.” The first was “Logic and reason sit on the mental board of directors, but emotions cast the deciding votes.” The second was, “Invention is the mother of necessity.”

    There were others, but I forget them right now. One, from my high school roommate (now the Episcopal Bishop of Bethlehem, PA — who blogs, in a fashion, here), was “Matter can be neither created nor destroyed. It can only be eaten.” He was sixteen when he said that.

    Anyway, one day I laid my second law on the CEO of our ad agency’s top client at the time, a company called Racal-Vadic. The CEO was Kim Maxwell, who taught at Stanford before kicking ass in business, and has since moved on to other things. He replied, “Ah, yes. Thorstein Veblen.”

    I thought, wtf? So I looked it up, and sure enough, Thorstein Veblen uttered “Invention is the mother of necessity” about a century before I made it one of my laws.

    Anyway, my point in using it remained the same: Silicon Valley was built on inventions that mother necessity (from ICs to iPhones) at least as much as it was built on necessities that mother invention.

    Just thought I’d share that out here in the un-silo’d non-F’book world.

  • Bring on The Live Web

    I first heard about the “World Live Web” when my son Allen dropped the phrase casually in conversation, back in 2003. His case was simple: the Web we had then was underdeveloped and inadequate. Specifically, it was static. Yes, it changed over time, but not in a real-time way. For example, we could search in real time, but search engine indexes were essentially archives, no matter how often they were updated. So it was common for Google’s indexes, even of blogs, to be a day or more old. PubSub and other live RSS-fed search engines came along to address that issue. But they mostly covered blogs and sites with RSS feeds. (Which made sense, since blogs were the most live part of the Web back then. And RSS is still a Live Web thing.)

    At the time Allen had a company that made live connections between people with questions and people with answers — an ancestor of  and @Replyz, basically. The Web wasn’t ready for his idea then, even if the Net was.

    The difference between the Web and the Net is still an important one — not only because the Web isn’t fully built out (and never will be), but because our concept of the Web remains locked inside the conceptual framework of static things called sites, each with its own servers and services.

    We do have live workarounds, for example with APIs, which are good for knitting together sites, services and data. But we’re still stuck inside the client-server world of requests and responses, where we — the users — play submissive roles. The dominant roles are played by the sites and site owners. To clarify this, consider your position in a relationship with a site when you click on one of these:

    Your position is, literally, submissive. You know, like this:

    But rather than dwell on client-server design issues, I’d rather look at ways we can break out of the submissive-dominant mold, which I believe we have to do in order for the Live Web to get built out for real. That means not inside anybody’s silo or walled garden.

    I’ve written about the Live Web a number of times over the years. This Linux Journal piece in 2005 still does the best job, I think, of positioning the Live Web:

    There’s a split in the Web. It’s been there from the beginning, like an elm grown from a seed that carried the promise of a trunk that forks twenty feet up toward the sky.

    The main trunk is the static Web. We understand and describe the static Web in terms of real estate. It has “sites” with “addresses” and “locations” in “domains” we “develop” with the help of “architects”, “designers” and “builders”. Like homes and office buildings, our sites have “visitors” unless, of course, they are “under construction”.

    One layer down, we describe the Net in terms of shipping. “Transport” protocols govern the “routing” of “packets” between end points where unpacked data resides in “storage”. Back when we still spoke of the Net as an “information highway”, we used “information” to label the goods we stored on our hard drives and Web sites. Today “information” has become passé. Instead we call it “content”.

    Publishers, broadcasters and educators are now all in the business of “delivering content”. Many Web sites are now organized by “content management systems”.

    The word content connotes substance. It’s a material that can be made, shaped, bought, sold, shipped, stored and combined with other material. “Content” is less human than “information” and less technical than “data”, and more handy than either. Like “solution” or the blank tiles in Scrabble, you can use it anywhere, though it adds no other value.

    I’ve often written about the problems that arise when we reduce human expression to cargo, but that’s not where I’m going this time. Instead I’m making the simple point that large portions of the Web are either static or conveniently understood in static terms that reduce everything within it to a form that is easily managed, easily searched, easily understood: sites, transport, content.

    The static Web hasn’t changed much since the first browsers and search engines showed up. Yes, the “content” we make and ship is far more varied and complex than the “pages” we “authored” in 1996, when we were still guided by Tim Berners-Lee’s original vision of the Web: a world of documents connected by hyperlinks. But the way we value hyperlinks hasn’t changed much at all. In fact, it was Sergey Brin’s and Larry Page’s insights about the meaning of links that led them to build Google: a search engine that finds what we want by giving maximal weighting to sites with the most inbound links from other sites that have the most inbound links. Although Google’s PageRank algorithm now includes many dozens of variables, its founding insight has proven extremely valid and durable. Links have value. More than anything else, this accounts for the success of Google and the search engines modeled on it.

    Among the unchanging characteristics of the static Web is its nature as a haystack. The Web does have a rudimentary directory with the Domain Name Service (DNS), but beyond that, everything to the right of the first single slash is a big “whatever”. UNIX paths (/whatever/whatever/whatever/) make order a local option of each domain. Of all the ways there are to organize things—chronologically, alphabetically, categorically, spatially, geographically, numerically—none prevails in the static Web. Organization is left entirely up to whoever manages the content inside a domain. Outside those domains, the sum is a chaotic mass beyond human (and perhaps even machine) comprehension.

    Although the Web isn’t organized, it can be searched as it is in the countless conditional hierarchies implied by links. These hierarchies, most of them small, are what allow search engines to find needles in the World Wide Haystack. In fact, search engines do this so well that we hardly pause to contemplate the casually miraculous nature of what they do. I assume that when I look up linux journal diy-it (no boolean operators, no quotes, no tricks, just those three words), any of the big search engines will lead me to the columns I wrote on that subject for the January and February 2004 issues of Linux Journal. In fact, they probably do a better job of finding old editorial than our own internal searchware. “You can look it up on Google” is the most common excuse for not providing a search facility for a domain’s own haystack.

    I bring this up because one effect of the search engines’ success has been to concretize our understanding of the Web as a static kind of place, not unlike a public library. The fact that the static Web’s library lacks anything resembling a card catalog doesn’t matter a bit. The search engines are virtual librarians who take your order and retrieve documents from the stacks in less time than it takes your browser to load the next page.

    In the midst of that library, however, there are forms of activity that are too new, too volatile, too unpredictable for conventional Web search to understand fully. These compose the live Web that’s now branching off the static one.

    The live Web is defined by standards and practices that were nowhere in sight when Tim Berners-Lee was thinking up the Web, when the “browser war” broke out between Netscape and Microsoft, or even when Google began its march toward Web search domination. The standards include XML, RSS, OPML and a growing pile of others, most of which are coming from small and independent developers, rather than from big companies. The practices are blogging and syndication. Lately podcasting (with OPML-organized directories) has come into the mix as well.

    These standards and practices are about time and people, rather than about sites and content. Of course blogs still look like sites and content to the static Web search engines, but to see blogs in static terms is to miss something fundamentally different about them: they are alive. Their live nature, and their humanity, defines the live Web.
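    The founding insight about link value mentioned in the essay above is easy to make concrete. Here is a toy sketch of that kind of ranking — a simplified PageRank power iteration over an invented three-page link graph. The pages, links and numbers are all illustrative, not anything from Google’s actual system:

```python
# Toy illustration: pages gain rank from inbound links, weighted by
# the rank of the pages that link to them. The link graph is invented.
links = {
    "a": ["b", "c"],   # page "a" links out to "b" and "c"
    "b": ["c"],
    "c": ["a"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    # Start every page with an equal share of rank.
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            # Each page passes its rank, split evenly, to the pages it links to.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

ranks = pagerank(links)
# "c" has the most inbound weight (links from both "a" and "b"),
# so it ends up with the highest rank.
print(max(ranks, key=ranks.get))
```

The point the essay makes survives even in this toy: the ranking emerges entirely from the structure of inbound links, with no human organizing the haystack.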

    This was before Twitter not only made the Web live, but did it in part by tying it to SMS on mobile phones. After all, phones work in the real live world.

    Since then we’ve come to expect real-time performance out of websites and services. Search not only needs to be up-to-date, but up-to-now. APIs need to perform in real time. And many do. But that’s not enough. And people get that.

    For example, has a piece titled Life in 2020: Your smartphone will do your laundry. It’s a good future-oriented piece, but it has two problems that go back to a Static Web view of the world. The first problem is that it sees the future being built by big companies: Ericsson, IBM, Facebook, Microsoft and Qualcomm. The second problem is that it sees the Web, ideally, as a private thing. There’s no other way to interpret this:

    “What we’re doing is creating the Facebook of devices,” said IBM Director of Consumer Electronics Scott Burnett. “Everything wants to be its friend, and then it’s connected to the network of your other device. For instance, your electric car will want to ‘friend’ your electric meter, which will ‘friend’ the electric company.”

    Gag me with one of these:

    This social shit is going way too far. We don’t need the “Facebook” of anything besides Facebook. In fact, not all of us need it, and that’s how the world should be.

    Phil Windley gagged on this too. In A Completely Connected World Depends on Loosely Coupled Architectures, he writes,

    This is how these articles always are: “everything will have a network connection” and then they stop. News flash: giving something a network connection isn’t sufficient to make this network of things useful. I’ll admit the “Facebook of things” comment points to a strategy. IBM, or Qualcomm, or AT&T, or someone else would love to build a big site that all our things connect to. Imagine being at the center of that. While it might be some IBM product manager’s idea of heaven, it sounds like dystopian dyspepsia to me.

    This reminds me of a May 2001 Scientific American article on the Semantic Web where Tim Berners-Lee, James Hendler, and Ora Lassila give the following scenario:

    “The entertainment system was belting out the Beatles’ ‘We Can Work It Out’ when the phone rang. When Pete answered, his phone turned the sound down by sending a message to all the other local devices that had a volume control. His sister, Lucy, was on the line from the doctor’s office: …”

    Sound familiar? How does the phone know what devices have volume controls? How does the phone know you want the volume to turn down? Why would you program your phone to turn down the volume on your stereo? Isn’t the more natural place to do that on the stereo? While I love the vision, the implementation and user experience is a nightmare.

    The problem with the idea of a big Facebook of Things kind of site is the tight coupling that it implies. I have to take charge of my devices. I have to “friend” them. And remember, these are devices, so I’m going to be doing the work of managing them. I’m going to have to tell my stereo about my phone. I’m going to have to make sure I buy a stereo system that understands the “mute the sound” command that my phone sends. I’m going to have to tell my phone that it should send “mute the sound” commands to the phone and “pause the movie” commands to my DVR and “turn up the lights” to my home lighting system. No thanks.

    The reason these visions fall short and end up sounding like nightmares instead of Disneyland is that we have a tough time breaking out of the request-response pattern of distributed devices that we’re all too familiar and comfortable with.

    David Weinberger tried to get us uncomfortable early in the last decade, with his book Small Pieces Loosely Joined. One of its points: “The Web is doing more than just speeding up our interactions and communications. It’s threading and weaving our time, and giving us more control over it.” Says Phil,

    …the only way these visions will come to pass is with a new model that supports more loosely coupled modes of interaction between the thousands of things I’m likely to have connected.

    Consider the preceding scenario from Sir Tim modified slightly.

    “The entertainment system was belting out the Beatles’ ‘We Can Work It Out’ when the phone rang. When Pete answered, his phone broadcast a message to all local devices indicating it had received a call. His stereo responded by turning down the volume. His DVR responded by pausing the program he was watching. His sister, Lucy, …”

    In the second scenario, the phone doesn’t have to know anything about other local devices. The phone need only indicate that it has received a call. Each device can interpret that message however it sees fit or ignore it altogether. This significantly reduces the complexity of the overall system because individual devices are loosely coupled. The phone software is much simpler and the infrastructure to pass messages between devices is much less complex than an infrastructure that supports semantic discovery of capabilities and commands.

    Events, the messages about things that have happened, are the key to this simple, loosely coupled scenario. If we can build an open, ubiquitous eventing protocol similar to the open, ubiquitous request protocol we have in HTTP, the vision of a network of things can come to pass in a way that doesn’t require constant tweaking of connections and doesn’t give any one silo (company) control of it. We’ve done this before with the Web. It’s time to do it again with the network of things. We don’t need a Facebook of Things. We need an Internet of Things.

    I call this vision “The Live Web.” The term was first coined by Doc Searls’ son Allen to describe a Web where timeliness and context matter as much as relevance. I’m in the middle (literally half done) of a book I’m calling The Live Web: Putting Cloud Computing to Work for People. The book describes how events and event-based systems can more easily create the Internet of Things than the traditional request-response style of building Web sites. I’m excited for it to be done. Look for a summer publishing date. In the meantime, if you’re interested I’d be happy to get your feedback on what I’ve got so far.

    Again, Phil’s whole post is here.
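    The loosely coupled scenario Phil describes is easy to sketch in code. Here is a minimal in-process event bus; the names (EventBus, “call_received”) and the devices are my own illustrations under that assumption, not any real protocol or product:

```python
# A publisher announces what happened; it knows nothing about listeners.
# Each subscriber decides for itself how to react, or ignores the event.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, **details):
        # Loose coupling: no discovery of capabilities, no commands sent.
        for handler in self.subscribers[event_type]:
            handler(details)

bus = EventBus()
log = []

# Each device interprets the event in its own terms.
bus.subscribe("call_received", lambda e: log.append("stereo: volume down"))
bus.subscribe("call_received", lambda e: log.append("dvr: pause program"))

# The phone only says that a call arrived. It never tells the stereo
# to mute or the DVR to pause.
bus.publish("call_received", caller="Lucy")
print(log)
```

Contrast this with the request-response version, where the phone would need to know every device, its command vocabulary, and whether it has a volume control at all. Here, adding a new device means adding a subscriber; the phone’s code never changes.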

    I compiled a list of other posts that deal with various VRM issues, including Live Web ones, at the ProjectVRM blog.

    If you know about other Live Web developments, list them below. Here’s the key: They can’t depend on any one company’s server or services. That is, the user — you — have to be the driver, and to be independent. This is not to say there can’t be dependencies. It is to say that we need to build out the Web that David Weinberger describes in Small Pieces. As Dave Winer says in The Internet is for Revolution, don’t just think decentralized. (Or re-decentralized, though that’s a fine thing. As is rebooting.) Think distributed. As I explained last year here,

    … the Net is not centralized. It is distributed: a heterarchy rather than a hierarchy. At the most basic level, the Net’s existence relies on protocols rather than on how any .com, .org, .edu or .gov puts those protocols to use.

    The Net’s protocols are not servers, clouds, wires, routers or code bases. They are agreements about how data flows to and from any one end point and any other. This makes the Internet a world of ends rather than a world of governments, companies and .whatevers. It cannot be reduced to any of those things, any more than time can be reduced to a clock. The Net is as oblivious to usage as are language and mathematics — and just as supportive of every use to which it is put. And, because of this oblivity, the Net supports all without favor to any.

    Paul Baran contrasted centralized systems (such as governments), decentralized ones (such as Twitter+Facebook+Google, etc.) and distributed ones, using this drawing in 1964:

    Design C became the Internet. Except the Internet is actually more like D in this version here:

    Because on the Internet you don’t have to be connected all the time. And any one node can connect to any other node. Or to many nodes at once. Optionality verges on the absolute.

    Time to start living. Not just submitting.

  • Open Cardspace opportunity

    Just learned from Craig Burton that Microsoft has killed off Windows Cardspace. Here’s the report from Mary Jo Foley. Here’s the Twitter search. Plenty of pointage to follow there. Here are Mike Jones’ reflections on the matter.

    I don’t have time to get my thoughts together on this right now, but here’s my brief take at this early point. As almost always with me, it’s optimistic:

    Good.

    What mattered most about Cardspace, or about Infocards (the non-Microsoft term), was the selector, which was something that the user operated, that was under user control. As Craig just put it to me on the phone, the selector tells a service that the client is not a machine, that the client has control, that there is a human being who makes his or her own choices about identity and other variables that have always belonged under the user’s control, but that the cookie-based system to which the commercial web has been defaulted from the beginning cannot recognize.

    What we (that is, developers) should do now is look at what Microsoft has abandoned, and use what we can of it to do what Microsoft did not, and apparently will not.

    Frankly, for all the great work that Mike, Kim Cameron and other Microsoft folks did in this space, the biggest problem has always been their employer. While Microsoft deserves credit for giving these good people lots of support and room to move — including open source development, no less — the legacy was always there. Microsoft was a hard company for the rest of the world to trust as a leader in an area that required maximum openness and minimum risk that BigCo moves would be pulled. Which is what Microsoft just did.

    So let’s move on.

  • Here comes a plasma bullet


    SpaceWeather.com says we’re overdue for a plasma bullet from the Sun. In Saturday’s issue (at that last link) they have a movie of a plasma blast coming out of sunspot 1147, on the eastern (left) edge of our nearest star.

    Yesterday they reported this:

    SOLAR FLARE: Sunspot 1158 has just unleashed the strongest solar flare of the year, an M6.6-category blast @ 1738 UT on Feb. 13th. The eruption appears to have launched a coronal mass ejection (CME) toward Earth. It also produced a loud blast of radio emissions heard in shortwave receivers around the dayside of our planet. Stay tuned for updates!

    BEHEMOTH SUNSPOT 1158: Sunspot 1158 is growing rapidly (48 hour movie) and crackling with M-class solar flares. The active region is now more than 100,000 km wide with at least a dozen Earth-sized dark cores scattered beneath its unstable magnetic canopy. Earth-directed eruptions are likely in the hours ahead.

    And today they say,

    EARTH-DIRECTED SOLAR FLARE: On Feb. 13th at 1738 UT, sunspot 1158 unleashed the strongest solar flare of the year so far, an M6.6-category blast. NASA’s Solar Dynamics Observatory recorded an intense flash of extreme ultraviolet radiation, circled below:

    The eruption produced a loud blast of radio waves heard in shortwave receivers around the dayside of our planet. In New Mexico, amateur radio astronomer Thomas Ashcraft recorded these sounds at 19 to 21 MHz. “This was some of the strongest radio bursting of the new solar cycle,” he says. “What a great solar day.”

    Preliminary coronagraph data from STEREO-B and SOHO agree that the explosion produced a fast but not particularly bright coronal mass ejection (CME). The cloud will likely hit Earth’s magnetic field on or about Feb. 15th. High-latitude sky watchers should be alert for auroras.

    The shot above is one I took on a flight from San Francisco to London in 2007. For some reason it flew far south of the usual route, giving me a somewhat distant view of the aurora that showed up outside my window on the left side of the plane. It was my first good view of one.

    I shot another set on another flight a few weeks later. When flying to and from Europe, always get a seat on the north side if you want to see the show.

    Meanwhile, watch SpaceWeather for more developments. Also NOAA’s Space Weather Service, which yesterday said this:

    February 11, 2011 — Region 11153 has rotated on to the far side of the Sun, having given us an M class flare and a handful of small C class flares over the past few days. Recently, there have been some small active regions appearing and disappearing that haven’t bothered to produce any interesting activity and there are currently 4 of them on the disk. What has the attention of forecasters is the Sun’s East limb, where old Region 11149 is just beginning to reappear having transited the far side of the Sun. During that transit, multiple coronal mass ejections were observed that were directed away from the Earth. If that region stays active, we could be in store for some interesting space weather in the days ahead as it moves towards the center of the solar disk.

    I have a feeling we’ll be taking a plasma bullet in the next few days.

  • Maybe the kids are alright

    I’ve been fairly quiet on the developments in Egypt, preferring to let others do the blogging, especially when they know far more than I do. (Ethan Zuckerman, for example.) But I’ve been involved in many conversations, because it’s damned interesting, what’s going on. One of those conversations is with my sister Jan, by email. She’s a retired Commander with the U.S. Navy, and a veteran of international matters as well, having served as an exchange officer with the British Royal Navy and as a protocol officer with the U.S. one.

    I liked an email she sent this morning well enough to ask her if it was cool to share it. She said yes, and here it is:

    I can’t help but believe that at least half the educated and aware (not always the same thing, is it?)  population of the world isn’t digesting yesterday’s outcome without thinking of their own government.  I liked Tom Friedman’s line in his latest column, “Postcard From a Free Egypt”: “Hello, Tripoli, Cairo calling.” I can feel his optimism and I have it, too.

    I don’t think this is going to be nasty to watch; I have been beyond impressed with the control the protestors have displayed in this process, and I just realized why:  Facebook may have gotten them into the Square, but it was Twitter that kept them in hand.  This was not the protest of the bullhorn, of the warping of direction by misinterpretation caused by passing the word along because the word was universally available in one shot! The age of reiteration is over.  Now is the age of the direct thought going out to all ears vs the age old chain of mouth to ear to mouth to ear….  That is the power of Twitter.

    So the message and the method stayed true.  No one went off the rails, the whole thing was non-violent in intent and in execution. And – the hitherto unimaginable – the youth stayed true to that.  Youth, who we associate with hooliganism in sports and overheated loyalty to their current cultural idols, they kept their eye firmly on the long-view.  They led their elders – the professionals who had lived under the thumb and threats of a tyrant, the educated who were stifled and stilled by fear, the political who were passively waiting.  The youth led, because they had a unity of purpose that was tightly held — or in this case twittered.

    Today I am stunned, and smiling, and … wondering.  Do our politicians realize that we, too, have an enormous disenfranchised population?  That we have a large, youth-filled population who feel they have few options or opportunities? That we have an underclass living in a poverty that should be unimaginable in a first-world country?  That we have an eager and interested population that feels its voice cannot be heard by our government over the cacophony of corporate interests?

    And this is not the voice of the Tea Party.  I think it will become glaringly obvious that the Tea Party was just a segment of the frustrated, found to be useful to and thereby fueled and funded by special interests, enlarged by bored and lazy media, and will eventually be fragmented by electoral fulfillment.  The population I’m thinking of has not been heard from yet.  The Administration may think that Organizing for America gives them a voice, but it hasn’t, because it is too one-way.  It is a fund-raising, message-passing tool of the administration.

    The voice heard in the square in Cairo and in the streets of Egypt did not rise up overnight or out of thin air.  That voice has been unheard because it was shouting in a vacuum.  But a vacuum cannot exist in cyberspace. Traditionally in revolutions the key is to take over the one-to-many vehicles of mass communication, radio and TV.  But this time they were not taken over; they were ignored.  They weren’t needed, because it was the masses that were communicating.

    So now we are in a new age, an age of leadership and governments being held accountable to the voice of the governed.  And in this new age I am optimistic for Egypt as well as other oppressed people.  I hope every autocrat and dictator is hearing footsteps in the dark.  And I hope our government is paying close attention — people have voices and, no matter how disenfranchised, they have just learned a new way to make them heard.

    Bonus link.

    [Later…] While this post has met with a fair amount of approval here and in the Twitterverse, Doug Skogland has some pushback.

    Perhaps linking to this piece by Nicholas Kristof will help.

  • Accidental Lessons: Reflections on the Challenger Tragedy

    [This piece was written for a publication in Raleigh, North Carolina, and published twenty-five years ago, on February 10, 1986. Since it might be worth revisiting some of the points I made, as well as the event itself, I decided to dust off the piece and put it up here. Except for a few spelling corrections and added links, it’s unchanged. — Doc]

    I can remember, when I first saw the movie Star Wars, how unbelievable it seemed that Han Solo and Luke Skywalker could fly their spacecraft so easily. They’d flick switches and glance knowingly at cryptic lights and gauges, and zoom their ways through hostile traffic at speeds that would surely kill them if they ran into anything; and they’d do all this with a near-absolute disregard for the hazards involved.

    That same night, after I left the movie theatre, I experienced one of the most revealing moments of my life. I got into my beat-up car, flicked some switches, glanced knowingly at some lights and gauges, and began to zoom my way through hostile traffic at speeds that would surely kill me if I ran into anything; and I did all this with a near-absolute disregard for the hazards involved. Suddenly, I felt like Han Solo at the helm of the Millennium Falcon. And in my exhilaration, I realized how ordinary it was to travel in a manner and style beyond the dreams of all but humanity’s most recent generations. I didn’t regret the likelihood that I would never fly in space like Han and Luke; rather I felt profoundly grateful that I was privileged to enjoy, as a routine, experiences for which many of my ancestors would gladly have paid limbs, or even lives.

    Since then I have always been astonished at how quickly and completely we come to take our miraculous inventions for granted, and also at how easily we use those inventions to enlarge ourselves, our capabilities, and our experience in the world. “I just flew in from the Coast,” we say, as if we were all born with wings.

    I think this “enlarging” capacity, even more than our brains and our powers of speech, is what makes us so special as creatures. As individuals, and as an entire species, we add constantly to our repertoire of capabilities. As one educator said, our capacity to learn is amplified by our ability to develop skills. Those skills give us the power to make things, and then to operate those things as if they were parts of ourselves. Through our inventions and skills, we acquire superhuman powers that transcend the weaknesses of our naked, fleshy selves.

    One might say that everything we do is an enlargement on our naked beginnings. That’s why we are the only animals that not only wear clothes, but who also care about how they look. After all, if we were interested only in warmth, comfort and protection, we wouldn’t have invented push-up bras and neckties. Or other non-essentials, like jewelry and cosmetics. It seems we wear those things to express something that extends beyond the limits of our bodies: the notions of our minds, about who we are and what we do.

    But clothes are just the beginning, the first and most visible layer in a series that grows to encompass all our tools and machines. When we ride a bicycle, for example, the bike becomes part of us. When we use a hammer to drive a nail, we ply that tool as if it were an extra length of arm. Joined by our skills to tools and machines, our combined powers all but shame the naked bodies that employ them.

    I remember another movie: a short animated feature in which metallic creatures from Mars, looking through telescopes, observed that the Earth was populated by a race of automobiles. Martian scientists described how cars were hatched in factories, fed at filling stations, and entertained at drive-in movies.

    And maybe they were right. Because, in a way, we become the automobiles we drive. Who can deny how differently we behave as cars than as people? It’s a black car that cuts us off at the light, not Mary Smith, the real estate agent. In traffic, we give vent to hostilities and aggressions we wouldn’t dare to release in face-to-face encounters.

    Of course, we have now metamorphosed into entities far more advanced than automobiles. As pilots we have become airplanes. As passengers we have become creatures that fly great distances in flocks.

    If those Martian scientists were to keep an eye on our planet, they would note that we have now begun to evolve beyond airplanes, into spaceships. In their terms we might note the Challenger tragedy as the metallic equivalent of a single failure in the amphibians’ first assault on land. Evolution, after all, is a matter of trial and error.

    But as we contemplate the price of our assault on the shores of space, we need to ask ourselves some hard questions. For example: is the Challenger tragedy just a regrettable accident in the natural course of human progress, or evidence of boundaries we are only beginning to sense?

    On January 28th, Challenger addressed that question to our whole species. We all felt the same throb of pain when we learned how, in one orange moment, seven of our noble fellows were blown to mist at the edge of the heavens they were launched to explore.

    Most of us made it our business that day to visit the TV, to watch the Challenger bloom into fire, and to share the same helpless feeling as we saw the smoking fragments of countless dreams rain down in white tendrils, like the devil’s own confetti, to the ancestral sea below. The final image — a monstrous white Y in the sky — is permanently embossed in the memories of all who witnessed the event.

    It was so unexpected because the shuttle had become exactly what NASA had planned: an ordinary form of transportation, a service elevator between Earth and Space. NASA’s plan to routinize space travel succeeded so convincingly that major networks weren’t even there to cover the Challenger liftoff. Instead they “pooled” for rights to images supplied by Ted Turner’s Cable News Network. Chuck Yeager, the highest priest in the Brotherhood of The Right Stuff, voiced the unofficial NASA line on the matter. “I see no difference between this accident and any accident involving a commercial or military aircraft,” he said.

    Would that it were so.

    “Fallen heroes” is not a term applied to plane crash victims. In fact, the technologies of space travel are still extremely young, and the risks involved are a lot higher than we like to think. “Since NASA made it look so easy, people thought it would never happen. Those of us close to the program thought it could happen a whole lot sooner. We’re glad it was postponed this long,” said Jack Lousma, a former astronaut and shuttle pilot.

    The fact that the shuttle program was so vulnerable, and we failed to recognize the fact, says unwelcome things about our faith in technology, and now is when we should listen to them. Because the time when flying through space becomes as easy as flying down the road, or even through the air, is still a long way off. In the meantime, it might be best to leave the exploring to guys like Lousma, who are blessed with the stuff it takes to push the risks out of the way for the rest of us.

    And we’re talking about the kind of risks that were built into the shuttle from its start.

    Consider for a moment that the shuttle program is, after all, the bastard offspring of a dozen competing designs, and constrained throughout its history by a budgetary process that subordinates human and scientific aspirations to a variety of military and commercial interests. And consider how, as with most publicly-funded technologies, most of the Shuttle’s components were all produced by the lowest bidder. And consider the fact that many of the Shuttle’s technologies are, even by NASA’s admission, obsolete. If we had to start at Square One today, we’d probably design a very different program.

    A new program, for example, would probably take better account of the Perrow Law of Unavoidable Accidents. A corollary of Murphy’s Law — “Anything that can go wrong, will go wrong” — the Perrow Law is modestly named after himself by Charles Perrow, Professor of Sociology and Organizational Analysis at Yale University. According to Perrow, the shuttle program has succeeded mostly in spite of itself. Its whole design is so detailed, so complex, so riddled with interdependent opportunities for failure, that we’re lucky one of the things didn’t blow up sooner, or worse, suffer a more agonizing death in space.

    “The number of interconnections in these systems is so enormous,” he says, “that no designer can think of everything ahead of time. It may be that this was one major valve failing on one of the tanks, but I rather suspect that that’s not the case. NASA tests and is very concerned about those valves. They have back-ups for every major system. The problem is more likely to have been a number of small things that came together in a mysterious way — a way that we may never learn about.”

    He continues, “The chances for an accident will be only marginally reduced if we find the cause of this, and harden something or increase the welds, and eliminate this one thing as a source of an accident. But right next to it will be a dozen other unique sources of accidents that we haven’t touched. But by touching the components next to it, we may increase the possibility of other accidents.”
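Perrow’s point scales in an unforgiving way, and a back-of-envelope calculation shows why. The part counts and the per-part reliability figure below are made up for illustration — they are not NASA numbers:

```python
# Back-of-envelope illustration of Perrow's argument: even very
# reliable parts, multiplied by the thousands, leave real odds
# of some failure somewhere. Figures are invented, not NASA's.

def p_any_failure(parts, per_part_reliability):
    """Probability that at least one of `parts` independent parts fails."""
    return 1 - per_part_reliability ** parts

# Each part works 99.99% of the time; watch the system-level odds.
for parts in (100, 1_000, 10_000):
    print(parts, round(p_any_failure(parts, 0.9999), 3))
```

With a hundred such parts the odds of some failure are about one percent; with ten thousand they pass sixty percent — and that is before counting the interactions between parts, which is where Perrow says the real mysteries live.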

    Tom Wolfe, who wrote The Right Stuff and invented the term, suggests that NASA may have snowed itself into believing that space travel is past the pioneering stage, and that, as a concept, the shuttle’s “coach & freight service — a people’s zero-G express” was premature. Of the martyred teacher, Christa McAuliffe, he says “Her flight was to be the crossover, at last, from a quarter of a century in which space had been a frontier open only to pioneers who lived and were willing to die by the code of ‘the right stuff’ — the Alan Shepards and Neil Armstrongs — to an era when space would belong to the entire citizenry, to Everyman. The last role in the world NASA had in mind for Christa McAuliffe and the rest of the Challenger crew was that of pioneer or hero.”

    This was because NASA had labored long and hard to break the political grip of what Wolfe calls “Astropower,” the “original breed of fighter-pilot and test-pilot astronaut — the breed who had been willing, over and over again, to sit on top of enormous tubular bombs, some 36 stories high, gorged with several of the most explosive materials this side of nuclear fission, and let some NASA GS-18 engineer light the fuse.”

    The fact was, Wolfe suggests, that McAuliffe and her companions “hurtled for 73 seconds out on the edge of a still-raw technology” before they perished. Which is why he asks “If space flight still involves odds unacceptable to Everyman, then should it be put back in the hands of those whose profession consists of hanging their hides, quite willingly, out over the yawning red maw?”

    If the answer is yes, then what will need to happen before Everyman is really ready to fly the zero-G express?

    In a word, simplification. Right now there is no way for a single pilot’s senses to stretch over the entire shuttle system, and operate it skillfully. A couple of years ago, the Director of Flight Operations for NASA said “this magnificent architecture makes it that much harder to learn to use the system.” According to Professor Perrow, “because the Shuttle system was designed in so many parts by a phalanx of designers, when it’s all put together to run, there is nobody, no one, who can know all about that system.”

    Perrow says “It requires simplification for a single person, a pilot, to know everything that’s happening in such a hostile environment as space.” One of the great simplifications in aviation history was the substitution of the jet engine for the piston engine. That’s what we need to make space travel agreeably safe.

    It is ironic that on the day the Challenger blew up, a space industry consultant and former NASA administrator was about to mail the first draft of a commission report to the president on the future of the U.S. space effort. That report advanced two recommendations: 1) an unmanned cargo-launching program to deliver cargo to space at a fraction of current shuttle costs; and 2) an improved shuttle or a new-generation system like the “hypersonic transportation vehicle” the Air Force has wanted ever since NASA beat the rocket airplane into space. The hypersonic transport would simply be an airplane that can fly in space. By contrast, the shuttle is a spacecraft that can glide to earth. Already, hypersonic transport technology has been around for years. Reports say the first “space plane” could be ready to fly in the 1990s. The thing would cruise along at anywhere from Mach 5 to Mach 25, which would mean, theoretically, that no two points on the earth would be more than three hours apart.
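The three-hour figure roughly checks out, at least at the slow end of that speed range. A quick sanity check, using constants that are my own approximations — Mach 1 near sea level and half of Earth’s circumference — rather than figures from the report:

```python
# Rough check of the "no two points more than three hours apart"
# claim. Constants are approximations: Mach 1 is ~1,225 km/h at
# sea level (it varies with altitude), and no two points on Earth
# are farther apart than half its ~40,000 km circumference.

HALF_CIRCUMFERENCE_KM = 20_000
MACH1_KMH = 1225

for mach in (5, 25):
    hours = HALF_CIRCUMFERENCE_KM / (mach * MACH1_KMH)
    print(f"Mach {mach}: ~{hours:.1f} h")
```

At Mach 5 the longest possible trip runs just over three hours; at Mach 25, well under one.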

    But it will have to fight the inertia behind the shuttle program, which is substantial, and slowed only momentarily by the Challenger explosion.

    I fear we can only pray that future missions will continue to dodge Murphy’s law.

    Over time, however, our sciences will need to face Perrow’s Corollary more soberly. We need to recognize that there are limits to the complexities we can build into our technologies before accidents are likely to occur. Thanks to Fail Safe, Doctor Strangelove, and other dramatic treatments of the issue, we are already familiar with (and regrettably taking for granted) the risks of nuclear catastrophe to which we are exposed by our terribly complicated “defensive shields.”

    And this hasn’t stopped us from committing to even more dangerous and complicated “defensive” projects, the most frightening of which is the euphemistically titled Strategic Defense Initiative, better known by its nickname: Star Wars. Professor Perrow says “Star Wars is the most frightening system I can think of.” In fact, Star Wars is by far the most complex technology ever contemplated by man. And possibly the most expensive.

    There are cost projections for Star Wars that make NASA’s whole budget look like pocket change. Portentously, the first shuttle experiment with Star Wars technology failed when shuttle scientists pointed a little mirror the wrong way. We can only hope that the little mirrors on Soviet warheads are aligned more cooperatively.

    Complexity is more than a passing issue. It is science’s most powerful and debilitating intoxicant. We teach it in our schools, confuse it with sophistication and sanction it with faith. In this High Tech Age, we have predictably become drunk on the stuff. And, as with alcohol and cocaine, we’ll probably discover its hazards through a series of painful accidents.

    Meanwhile, there is another concern that ironically might have been illuminated by a teacher, or better yet a journalist, in space. Its advocates include a recently-created organization of space veterans whose non-political goal is to share their singular view of our planet. That view sees a fragile ball of blue, green and brown, undivided by the lines that mark the maps and disputes on the surface below. It is an objective view, and we need it badly.

    The implications of that view are made more sober by recent discoveries suggesting limits to the viability of human life in the environments of space. Outside the protective shield of our atmosphere, travelers are bombarded constantly by cosmic radiation that produces cancer and other ailments.

    Weightlessness also has its long-term costs. While there may be ways to reduce or eliminate the risks involved with space travel, we are still, at best, in the zygote stage of our development as space creatures. It might be millennia before we are finally ready to leave Earth’s womb and dodge asteroids in the manner of Han Solo.

    Until then, it would be nice if we didn’t have to discover our limits the hard way.

  • Northeastern lights

    As you can tell, if you read the small print in this StrikeStar map, we’re being hit by lightning (and therefore thunder) here in the Boston area, right now. After nothing but snow, over and over, for a month and a half, we get a day of rain with a Summer encore. Very strange.

  • Quiller Quote

    I’m reading Griftopia: Bubble Machines, Vampire Squids, and the Long Con That Is Breaking America, by Matt Taibbi, and loving every page of it. The prose is over-the-top in the manner of Hunter S. Thompson, without the drugged persona. Every page has at least one quote-worthy line, but the one that made me yell “Wow!” came on page 209, where he describes Goldman Sachs as “a great vampire squid wrapped around the face of humanity, relentlessly jamming its blood funnel into anything that smells like money.” And now I see that line makes it into the first paragraph of Matt’s Wikipedia entry. And he has 23,771 followers (@mtaibbi) on Twitter.

    So I’m behind the curve on this one. But we can all catch up. He blogs (and more) for Rolling Stone.

  • Al Jazeera in Egypt is cable’s ‘Sputnik moment’

    “Cable companies: Add Al Jazeera English *now*,” Jeff Jarvis commands, correctly, on his blog — and also elsewhere, under a different headline. For me now was a few minutes ago, when I read both items on the family iPad, which has been our main news portal since the Globe quit coming and I suspended my efforts to reach them by Web or phone. (The Globe also wants a bunch of ID crap when I go there on the iPad, so they’re silent that way too.) So I went to the App Store, looked up Al Jazeera, saw something called Al Jazeera English Live was available for free, got it, and began watching live protest coverage from Cairo.

    We don’t have cable here. We dumped it after network news turned to shit, and we found it was easier to watch movies on Netflix. We still like to watch sports, but cable for sports alone is too expensive, because it’s always bundled with junk we don’t want and not available à la carte. (You know, like stuff is on the Web.) When we want TV news, we go online or get local TV through a gizmo plugged into an old Mac laptop. It works well, but it’s still TV.

    And so is Al Jazeera on an iPad/iPhone, Samsung Wave or a Nokia phone. (See http://english.aljazeera.net/mobile/ for details. No Android or BlackBerry yet, apparently.) The difference is that real news is happening in Egypt, and if you want live news coverage in video form, Al Jazeera is your best choice. As Jeff puts it, “Vital, world-changing news is occurring in the Middle East and no one — not the xenophobic or celebrity-obsessed or cut-to-the-bone American media — can bring the perspective, insight, and on-the-scene reporting Al Jazeera English can.”

    And it’s very good. As one blogger put it, “If you’re watching Al Jazeera, you’re seeing uninterrupted live video of the demonstrations, along with reporting from people actually on the scene, and not ‘analysis’ from people in a studio. The cops were threatening to knock down the door of one of its reporters minutes ago. Fox has moved on to anchor babies. CNN reports that the ruling party building is on fire, but Al Jazeera is showing the fire live.”

    In fact six Al Jazeera journalists are now being detained (I just learned). That kind of thing happens when your news organization is actually involved in a mess like this. CNN used to be that kind of organization, but has been in decline for years, along with other U.S. network news organizations. As Jeff says, “What the Gulf War was to CNN, the people’s revolutions of the Middle East are to Al Jazeera English. But in the U.S., in a sad vestige of the era of Freedom Fries, hardly anyone can watch the channel on cable TV.”

    And that’s a Good Thing, because cable is mostly shit in a pipe, sphinctered through a “set-top box” that’s actually a computer crippled in ways that maximize control by the cable company and minimize choice for the user. Fifteen years ago, the promise of TV was “five hundred channels”. We have that now, but we also have billions of sources — not just “channels” — over the Net. Cream rises to the top, and right now that cream is Al Jazeera and the top is a hand-held device.

    The message cable should be getting is not just “carry Al Jazeera,” but “normalize to the Internet.” Open the pipes. Give us à la carte choices. Let us get and pay for what we want, not just what gets force-fed in bundles. Let your market — your viewers — decide what’s worth watching, and how they want to watch it. And quit calling Internet video “over the top”. The Internet is the new bottom, and old-fashioned channel-based TV is a limping legacy.

    A few days ago, President Obama spoke about the country’s “Sputnik moment”. Well, that’s what Al Jazeera in Egypt is for cable TV. It’s a wake-up call from the future. In that future we’ll realize that TV is nothing more than a glowing rectangle with a boat-anchor business model. Time to cut that anchor and move on.

    Here’s another message from the future, from one former cable TV viewer: I’d gladly pay for Al Jazeera. Even when I can also get it for free. All we need is the mechanism, and I’m glad to help with that.

  • Learnings from the Browser Wars

    The question on Quora goes, What lessons can be learned from the first browser war between Microsoft and Netscape?

    I covered that war when it broke out, more than fifteen years ago. No magazine was interested in my writing then. Blogging was several years off in the future. All we had were websites, and that was good enough. The following is what I put up on mine — in as much of the original HTML as can survive WordPress’ HTML-rewriting mill. I’ll continue below the piece…


    MICROSOFT+NETSCAPE

    WHY THE PRESS NEEDS TO SNAP OUT OF ITS WAR-COVERAGE TRANCE

    By Doc Searls
    December 11, 1995

    Star Wars?

    Am I wrong here, or has the Web turned into a Star Wars movie?

    I learn from the papers that the desktop world has fallen under the iron grip of the most wealthy and powerful warlord in the galaxy. With a boundless greed for money and control, Bill Gates of Microsoft now seeks to extend his evil empire across all of cyberspace.

    The galaxy’s only hope is a small but popular rebel force called Netscape. Led by a young pilot (Marc Andreessen as Luke Skywalker), a noble elder (Jim Clark as Obi-Wan Kenobi) and a cocky veteran (Jim Barksdale as Han Solo), Netscape’s mission is joined by the crafty and resourceful Java People from Sun.

    Heavy with portent, the headlines tromp across the pages (cue the Death Star music — dum dum dum, dum da dum, dum da dummm)…

    • “MICROSOFT TAKES WAR TO THE NET: Software giant plots defensive course based on openness”
    • “MICROSOFT UNVEILS INTERNET STRATEGY: Stage set for battle with Netscape.”
    • “MICROSOFT, SUN FACE OFF IN INTERNET RING”
    • “MICROSOFT STORMS THE WEB”

    The mind’s eye conjures a vision of The Emperor, deep in the half-built Death Star of Microsoft’s new Internet Strategy, looking across space at the Rebel fleet, his face twisted with contempt. “Your puny forces cannot win against this fully operational battle station!” he growls.

    But the rebels are confident. “In a fight between a bear and an alligator, what determines the victor is the terrain,” Marc Andreessen says. “What Microsoft just did was move into our terrain.”

    And Microsoft knows its strengths. December 7th, The Wall Street Journal writes, Bill Gates “issued a thinly veiled warning to Netscape and other upstarts that included a reference to the Pearl Harbor attack on the same date in 1941.”

    Exciting stuff. But is there really a war going on? Should there be?

    What are the facts?

    After reading all these alarming headlines, I decided to fire up my own copy of Netscape Navigator and search out a transcript of Bill’s December 7th speech.

    I started at Microsoft’s own site, but got an “access forbidden” message. Then I went up to the internet level of the site’s directory, but found the Netscape view was impaired. (“Best viewed with Microsoft Explorer,” it said.) I finally found a Netscape-friendly copy at Dave Winer’s site. It appears to be the original, verbatim:

    MR. GATES: Well, good morning. I was realizing this morning that December 7th is kind of a famous day. (Laughter.) Fifty-four years ago or something. And I was trying to think if there were any parallels to what was going on here. And I really couldn’t come up with any. The only connection I could think of at all was that probably the most intelligent comment that was made on that day wasn’t made on Wall Street, or even by any type of that analyst; it was actually Admiral Yamamoto, who observed that he feared they had awakened a sleeping giant. (Laughter.)

    I see. The “veiled threat” was Bill’s opening laugh line. Even if this was “a veiled threat,” it was made in good humor. The rest of the talk hardly seemed hostile. Instead, Bill showed a substantial understanding of how both competition and cooperation work to build markets, and of the roles played by users, developers, leaders and followers in creating the Internet. In his final sentence, Bill says, “We believe that integration and continuity are going to be valuable to end users and developers…”

    Of course, I wish he’d pay a little more attention to Macintosh users and developers, but I don’t blame him for avoiding them. I blame Apple, which dissed and sued Microsoft for years, to no positive effect. Apple played a zero-sum game and — sure enough — ended up with zero. Brilliant strategy.

    Think how much farther along we would be today if this relationship was still Apple plus Microsoft, rather than Apple vs. Microsoft.

    The truth is that the Web will be better served by Microsoft plus Netscape than by Microsoft vs. Netscape. Plus is what most of us want, and it’s probably what we’ll get, regardless of how the press plays the story.

    Give a big AND to the Web

    So what is the best way to characterize Microsoft, if not as the Heaviest of Heavies?

    I think Release 1.0’s Jerry Michalski gets closest to it when he says: “Microsoft thinks more broadly than any other company about what it’s doing. Its plans include global telecommunications, information creation, applications — even community building.” That tells us a lot more than “Microsoft goes to war.”

    Markets are more than battlefields. The OR logic of war and sports gets us excited, but tells us little of real substance. For that we also need the AND logic of cooperation, choice, partnership and working together. What we all want most — love — is hardly an OR proposition. Imagine a lover saying “there’s only room in this relationship for one of us, baby.”

    But the press is caught in an OR trance. Blind to the AND logic that gives markets their full color, the press reduces every hot story to the black vs. white metaphors of war and sports. Why cover the Web as the strange, unprecedented place it is, when you can play it as yet another story about two guys trying to beat the crap out of each other? Especially when the antagonists are a little good guy and a big bad guy?

    Look, the Internet didn’t take off because Netscape showed up; and it wasn’t slowed down because Microsoft didn’t. It took off because millions of people added their creative energies to something that welcomed them — which was mostly each other. Death-fight competition didn’t make the Web we know now, and it won’t make the Web that’s coming, either.

    That’s because every site on the Web is AND logic at work. So is every vendor/developer relationship that ever produced a product or created a market. So is the near-infinite P/E ratio Netscape enjoys today.

    So, what IS Microsoft doing?

    “Embrace and extend,” Bill Gates called it in his December 7 talk. That’s what he said Microsoft will do with products from Oracle, Spyglass, CompuServe and Sun. Is this an AND strategy? Or is it yet another example of what Gary Reback, Judge Sporkin and other Microsoft enemies call a “lock and leverage” strategy, intended to drive out competition and let Microsoft charge tolls to every traveler on the Information Highway?

    We’ll see.

    It should be clear by now that the Web does not welcome OR strategies. Microsoft Network was an OR strategy, and it didn’t work. If history repeats itself (as it usually does with Microsoft), the company will learn from this experience (as Apple learned earlier from its eWorld failure) and move on to do the Right Thing.

    Not that most of the press would notice. To them Microsoft is The Empire and Bill is its gold-armored emperor. But reporters are the ones putting clothes on this emperor. To the people who make Microsoft’s markets — the users and developers — “billg” is as naked as a newborn.

    Take away the war-front headlines, the play-by-play reporting, the color commentary by industry analysts, the infatuation with personal wealth — and you see Bill as an extremely competitive guy who’s also trying to do right by users and developers. And hiding little in the process. Is he a bully? Sometimes. Is this bad? No, it’s typical of big companies since the dawn of business. It looks to me more like a personality trait than a business strategy. And what makes Microsoft win is far more strategic than personal.

    George Gilder puts it this way in Forbes ASAP (“Angst & Awe on the Internet”):

    Blinded by the robber-baron image assigned in U.S. history courses to the heroic builders of American capitalism, many critics see Bill Gates as a menacing monopolist. They mistake for greed the gargantuan tenacity of Microsoft as it struggles to assure the compatibility of its standard with tens of thousands of applications and peripherals over generations of dynamically changing technology.

    How to win users and influence developers

    How does Bill express that tenacity? As Dave Winer puts it in “The Platform is a Chinese Household,” Bill “sends flowers.” Bill courts developers and delivers for customers, who return the favor by buying Microsoft products.

    Markets are conversations, and there isn’t a more willing conversational participant than Bill. That’s why I’m not surprised when Dave says “the only big company that’s responsive to my needs is Microsoft.” And Dave, by the way, is a pillar of the Macintosh community. To my knowledge, he hasn’t developed a DOS-compatible product since the original ThinkTank.

    Users and developers don’t need to hear vendors talk about how much their competition sucks. No good ever comes of it. Is it just coincidence that Microsoft almost never bad-mouths its competition? Though Bill is hardly innocent of the occasional raspberry, he’s a long way from matching the nasty remarks made about him and his company by leaders at Sun, Apple, Netscape and Novell, just to name an obvious few.

    It especially saddens me to hear competition-bashing from Guy Kawasaki, whose positive energies Apple desperately needs right now. As a customer and user of both Apple and Microsoft products, I see Guy’s “how to drive your competition crazy” rap as OR logic at its antiproductive worst.

    At the opposite end of the diplomacy scale, I like the way Gordon Eubanks of Symantec has consistently been fair and constructive in his public remarks about Bill and Microsoft (and has reaped ample rewards in the process).

    What makes markets work is a combination of AND and OR processes that deserve thoughtful and observant journalism. They also call for vendors who can drop their fists, open their minds and look at opportunities from users’ and developers’ points of view. This is how Microsoft came to change its Internet strategy. And this is what makes Microsoft the most adaptive company in the business, regardless of size. No wonder the laws of Darwin have been kind to them.

    A new breed of life

    Urge and urge and urge,
    Always the procreant urge of the world.
    Out of the dimness opposite equals advance…
    Always substance and increase,
    Always a knit of identity… always distinction…
    Always a breed of life.
    —Walt Whitman

    Where the language of war fails, perhaps the language of Whitman can succeed.

    By the great poet’s lights, the Web is a new breed of life. An original knit of identity. Its substance increases when opposite equals like Netscape and Microsoft advance out of the dimness and obey their procreant urges — not their will to kill.

    The Web is a product of relationships, not of victors and victims. Not one dime Netscape makes is at Microsoft’s expense. And Netscape won’t bleed to death if Microsoft produces a worthy browser. The Web as we know it won’t be the same in six weeks, much less six months or six years. As a “breed of life,” it is original, crazy and already immense. It is not like anything. To describe it with cheap-shot war and sports metaphors is worse than wrong — it is bad journalism.

    A week after this experience, I went back to the Microsoft site and found its whole Internet Strategy directory much more Netscape-friendly and nicely organized. Every presentation is there, including all the slides. Though the slides are in PowerPoint 4.0 for Windows, my Mac is able to view them with the Mac version of the program. [Back to *]

    George Gilder’s Forbes ASAP article archives are at his Telecosm site.

    Dave Winer’s provocative “rants” come out every few days, and accumulate at his DaveNet site. Check out “The User’s Software Company,” which inspired this essay.


    One might look back on this and say “Yeah, but Microsoft still killed Netscape.” I don’t think so. Netscape had many advantages, including one it turned to too late to save the company — but not too late to save the browser and keep it competitive: open-sourcing the Mozilla code. Five years after I wrote the above, I wrote a piece in Linux Journal describing Netscape’s mistakes:

    For a year or two, Netscape looked like it could do no wrong. It was a Miata being chased down a mountain road by a tractor trailer. As long as it moved fast and looked ahead, there was no problem with the truck behind. But at some point, Netscape got fixated on the rear-view mirror. That’s where they were looking when they drove off the cliff.

    Why did they do that?

    1. They forgot where they came from: the hacker community that had for years been developing the Net as a free and open place—one hospitable to business, but not constrained by anybody’s business agenda. The browser was born free, like Apache, Sendmail and other developments that framed the Net’s infrastructure. The decision to charge for the browser—especially while still offering it for free—put Netscape in a terminal business from the start.
    2. They got caught up in the market’s transient fashions, which were all about leveraging pre-Web business models into an environment that wouldn’t support them. Mostly, they changed the browser from a tool of Demand (browsing) to an instrument of Supply. They added channels during the “push” craze. They portalized their web site. They turned the location bar into a search term window for a separate domain directory, to be populated by the identities of companies that paid to be put there (a major insult to the user’s intentions). Worst of all, they bloated the browser from a compact, single-purpose tool to an immense contraption that eventually included authoring software, a newsgroup reader, a conferencing system and an e-mail client—all of which were done better by stand-alone applications.
    3. They became arrogant and presumptuous about their advantages. At one point, Marc Andreessen said an OS was “just a device driver”.
    4. Their engineering went to hell. By the time Netscape was sold (at top dollar) to AOL, the dirty secret was that its browser code was a big kluge and had been for a long time. Jamie Zawinski (one of the company’s first and best-known engineers) put it bluntly: “Netscape was shipping garbage, and shipping it late.” Not exactly competitive.
    5. They lost touch with their first and best market: those customers who had actually paid for that damn browser.

    So, back to the original question. What have we learned, now that IE is still around, and most of its competitors are either open source or based on open source code? Here’s a quick list:

    1. The browser was never a product in the sense that it’s something that can be charged and paid for as a scarce good. It wanted to be open source in the first place.
    2. The war metaphor is distracting and misleading, even when it’s appropriate.
    3. No browser is even close to perfect, and none will ever be.

    Feel free to add more of your own, here or on Quora. (I’m very curious to see how Quora evolves.)

  • What station(s) does KDFC pave in the South Bay?

    So now KDFC is on 90.3 and 88.9, while KUSF is off the air. (Though it does have a Live365 stream.) Radio Valencia, a pirate radiating out of the Mission district on 87.9, has expressed sympathy with KUSF’s exiled volunteers, and has provided some airtime as well. The University of San Francisco, which sold the 90.3 license to the University of Southern California, currently has KUSF.org re-directing to this 9-day-old press release.

    In my last post I suggested that KUSF’s volunteers apply for 87.7 as a licensed low power TV station. (As fate has it, the audio for Channel 6 TV is roughly on 87.7). I had forgotten about Radio Valencia when I wrote that. Perhaps the two groups can get together and go after 87.7, if that window is actually open.

    The KUSF community (at SaveKUSF.org) remains committed to getting their frequency back. The likelihood of this rounds to zero, but I wish them luck. (They’re having some luck with SF supervisors.) I still think the future of radio is over the Net in any case. Going forward in that direction, a big question for KUSF’s community is how it can keep dealing with USF, which will provide the streaming, the studio, the record library and other essentials, such as the KUSF brand, which is the university’s intellectual property. I’ll be interested in hearing how that non-divorce works out.

    Meanwhile there is the matter of expanding KDFC. On KQED’s Forum last week, Brenda Barnes, president of USC radio (which bought KUSF’s license and is moving KDFC there) and managing director of the Classical Public Radio Network (which will operate KDFC locally), said many times that her organizations are looking to buy a signal, or signals, in the South Bay, where KDFC can’t be heard from either of its new facilities (the old KUSF on 90.3 and the old KNDL in Angwin on 89.9).

    It could be that the USC people are also already thinking about 87.7 (the Channel 6 TV hack) in the South Bay. If that radiates from one of the mountains down there, it would do a good job. (The signal would be weak, but reach far, kind of like KFJC does now). That would be the best solution, I think; but it would also foreclose the 87.7 option for KUSF-in-exile, essentially screwing them over a second time. (So, there’s an assignment for both KUSF and Radio Valencia. Hurry up and see what can be done.)

    The more likely option for KDFC is finding a college or university that would rather have money than continue operating a radio station, especially when a buyer comes calling. That’s the option USF took, and it’s a certain bet that Brenda Barnes and friends are already hard at work selling the same options to one or more of these FMs in the South Bay:

    • 89.1 KCEA Atherton, owned by Menlo-Atherton High School. Broadcasts with 100 watts from a ridge in San Carlos. Small signal.
    • 89.3 KOHL Fremont, owned by Ohlone Community College. Covers the eastern part of the South Bay with 145 watts from the college campus in the foothills.
    • 89.7 KFJC Los Altos, owned by Foothill Junior College. Covers the South Bay well, from Black Mountain, with just 108 watts. This is the KUSF of the South Bay, and the station/community with the most to worry about.
    • 90.1 KZSU Stanford, owned by Stanford University. Covers Palo Alto and the central Peninsula with 500 watts from a hill on The Farm. KDFC’s 90.3 signal in San Francisco protects KZSU with a null in the direction of Stanford. The option here for the KDFC folks would be to buy KZSU and turn it into a KDFC repeater, or to take it dark and crank up the San Francisco signal. But then, there’s also…
    • 90.5 KSJS San Jose, owned by San Jose State University. This too has a community. And it covers the San Jose end of the South Bay well with 1500 watts on a high hill on the south side of town. 90.3 in The City also protects KSJS, so the same options for KDFC apply here as with KZSU.
    • 91.1 KCSM San Mateo, owned by the College of San Mateo. This is the Bay Area’s much-loved jazz station, and covers the Peninsula and Mid-South Bay pretty well, plus Oakland-Berkeley. Wattage-wise, it’s the most powerful of the options (11,000 watts), though the transmitter is not on a high site.
    • 91.5 KKUP Cupertino, owned by the Assurance Science Foundation. With 200 watts on Loma Prieta Mountain, KKUP reaches a large area, including all of Monterey Bay (Santa Cruz, Salinas, etc.) as well as the south part of the South Bay.

    Another possibility for KDFC is buying a commercial station in the South Bay. There are many of those to choose from, if any is willing to sell. None will be cheap, but most would be better than the options above, with the conditional exceptions of KCSM and KFJC. For example, KCNL on 104.9, which Clear Channel unloaded last year for $5 million, would have been a good deal for the USC people. It serves the South Bay quite well with a 6,000 watt signal from the foothills near San Jose. KRTY from Los Gatos on 95.3 is another one with a similar-sized signal.

    In any case, we know who is on the hunt and why. If they succeed, KDFC listeners should be happy. Listeners to the replaced station, or stations, will not be. Looking at the ratings, I am betting that there are more of the former than the latter. In the most recent rating period, KDFC was Number 7 overall (out of many dozens of signals), with a 3.9% share of Average Quarter Hour listening, which is great for any station and huge for a classical one. It also had a cumulative audience of 632,000 people, none of whom can get the station today on the signal they listened to during that ratings period.

    [Later…] A February 10 post at RadioSurvivor.com.

  • Suggestion for KUSF: go to 87.7

    87.7 is a frequency that has been open on FM since TV’s digital transition in 2009, which cleared most TV signals off of channels 2-6. (Digital TV stations now identify as “virtual” channels. KRON/4, for example, actually radiates on Channel 38). The audio signal for the old Channel 6 is at approximately 87.7, and it’s cool for a low power TV station to broadcast there and to bypass video altogether, or close enough.
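
    The arithmetic behind that “roughly on 87.7” is simple. Here’s a quick sketch of it, using the standard NTSC carrier offsets (the numbers below are general broadcast-engineering facts, not anything taken from the FCC pages mentioned in this post):

```python
# Why analog Channel 6 audio lands at the bottom of the FM dial.
# NTSC Channel 6 occupies 82-88 MHz. The visual (picture) carrier sits
# 1.25 MHz above the channel's lower edge, and the aural (sound) carrier
# sits 4.5 MHz above the visual carrier.
lower_edge = 82.0                      # MHz, bottom of TV Channel 6
visual_carrier = lower_edge + 1.25     # 83.25 MHz
aural_carrier = visual_carrier + 4.5   # 87.75 MHz

print(aural_carrier)
```

    That puts the sound carrier at 87.75 MHz, just below the FM band’s lowest licensed channel at 87.9, which is why an ordinary FM radio tuned to “87.7” hears it as if it were an FM station.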

    In other words, you might be able to get an FM station going on 87.7 through a license to operate a low power TV station on Channel 6. That’s what WNYZ-LP does in New York and KSFV-LP (which operates as Guadalupe 87.7) does in Los Angeles. And it’s what KXDP-LP (ESPN sports) does in Denver.

    The FCC says here that applications can be made during “windows” of time, and points to a page that says nothing about that. So, do some digging. Could be the option is closed, but … might not be.

    KQED might object, even though 87.7 is four channels away from KQED’s 88.5.  So might 17-watt KECG in El Cerrito or 7-watt KSRH in San Rafael, both of which broadcast on 88.1. Or 10-watt KSFH in Mountain View on 87.9. NPR might object too, given its ongoing opposition to the practice of operating an “ersatz” TV station just to put a radio signal on 87.7fm. But they also might not care.

    Operating a pirate on that channel is also an option. It’s not a legal one, but it seems to fly as long as nobody objects. “Hot 97” in Boston has been going since 2009 at 87.7, showing up shortly after WLNE-TV in Providence/New Bedford abandoned Channel 6 (it’s now on Channel 49). Hot 97’s power isn’t published, but I’ve seen reports saying it’s 5,000 watts. I wouldn’t be surprised, since it’s bigger than many of the noncommercial signals in town, and nearly competitive with the commercial ones. (I first wrote about it here.)

    While I wish the KUSF community well in its fight against USF and USC (and maybe also Entercom and the FCC), I think the odds of getting 90.3 back are between slim and none. The best option is to explore other ones.

  • KDFC wounded, KUSF killed (almost)

    This week the Bay Area loses two of its radio landmarks. On 102.1fm, KDFC, which has been broadcasting classical music since 1946, will be replaced by a simulcast of KUFX (“K-FOX”), a classic rock station in San Jose. And on 90.3 fm, KUSF, which has been one of the most active and community-involved free-form college radio stations in history, has gone silent. When the signal on 90.3 comes back on the air, it will carry the KDFC call letters and classical music programming. Meanwhile the old KUSF will continue in some form online. The new KDFC will also broadcast on 89.9, which is the former home of KNDL, a station licensed to Angwin.

    This graphic, combined from three coverage maps at Radio-Locator.com, shows the before-and-after situation. One red line is KDFC’s old primary coverage area on 102.1. The other two are its new primary coverage areas on 90.3 and 89.9:

    (More about signals below at *)

    Since the 90.3 signal is tiny, and the 89.9 signal is far away, KDFC will be losing a great deal of coverage. Neither of the new signals serves the Peninsula, the South Bay or the East Bay beyond Berkeley and Oakland. KUSF needs to start over online. On the FM band, it’s dead.

    What happened was a three-way deal between Entercom, the University of San Francisco and the University of Southern California. Entercom is one of the largest owners of broadcast properties in the country, and an aggressive buyer of more. So is USC, which has expanded its classical network from Los Angeles to five stations spread from Morro Bay to Palm Springs. USF, like many universities, held a broadcast license that had monetary value on the open market while producing no income for the university itself.

    According to Radio Ink and other sources, here’s how the deal went down:

    1. USF sold the 90.3 frequency to USC for $3.8 million.
    2. USC also bought KNDL for $2.8 million.
    3. Entercom, which owns KDFC, bought KUFX from the Clear Channel Aloha Trust, and will simulcast KUFX (still as “K-FOX”) over KDFC’s old 102.1 facility. Entercom will also give KDFC’s call letters and record collection to “A new San Francisco-based nonprofit.”

    The press releases:

    While it’s nice that KDFC has stayed alive, its move to much weaker signals is a far bigger loss for Bay Area classical music listeners than losses suffered by listeners when New York’s WQXR and Boston’s WCRB made similar moves. WQXR stayed on the air with a smaller signal from the same antenna, and WCRB moved to a same-size transmitter a couple dozen miles from the center of town, but most listeners could still get the stations. KDFC’s new facilities only cover a fraction of the population reached by the old signal. Essentially the new station covers San Francisco, and that’s it. More about coverage below*.

    KDFC’s listenership is not small. The raw numbers are actually outstanding. According to Radio-Info.com (which leverages Arbitron), KDFC had 632,000 listeners in the most recent ratings period (December 2010), a notch above news-talk leader KGO (624,100). KDFC’s 3.2 average quarter hour (AQH) share was tied for #8 in the market, one notch above “sports giant” KNBR, which scored a 2.8. (KGO was #1 overall for most of the last six decades, and KNBR is an AM powerhouse that covers at least half of California by day and the whole West at night.) In fact, KDFC had better overall numbers than any other Entercom station in the Bay Area.

    The problem for Entercom was the format. It’s hard to sell advertising for classical music stations, which have less inventory to offer (sports, news and popular music stations carry many more minutes of advertising per hour), and serve an older audience as well.

    Judging from the KDFC statement on its website, The Classical Public Radio Network will hold the license, even though it closed down a few years ago, sort of. It also says,

    The new KDFC has already begun to look for new signals to offer reception in the South Bay and the entire Bay Area for our around-the-clock classical programming.

    We are happy to let you know Dianne Nicolini, Hoyt Smith, Rik Malone, and Ray White will continue as your on-air hosts, and KDFC’s partnerships with the Bay Area arts and culture community will continue to grow and thrive.

    KDFC is the last major commercial classical station in America to make the transition to public radio. This move ensures that classical radio is sustainable for our community into the future. Since 1947, Bay Area classical fans have shown their passionate support for KDFC. Now more than ever, we’re grateful for that support as we begin the new era of Classical KDFC. Comments can be made to comments@myclassical.org, or by phoning 415-546-8710. If you’d like to send a check as a Founder for the Future of KDFC, please send a check to:

    The Classical Public Radio Network, 201 Third Street, 12th floor, San Francisco, CA 94103.

    It’s signed by Bill Lueth, Vice President, KDFC. Bill and the other names he mentions are Bay Area classical radio institutions as well.

    As for KUSF, maybe going online will be a form of liberation. As signals go, 90.3 barely covered San Francisco. The Internet covers the world. And Internet radio is growing fast. Arbitron now includes online streams in its ratings, which it wouldn’t do if those streams were not significant. In San Francisco, KNBR’s stream had more than 50,000 listeners in November. In Los Angeles, KROQ’s stream had 67,900 listeners in December. Many more people every day are listening to radio on phones and other portable devices. Even Howard Stern, when he renewed with Sirius in December, said the future of satellite listening isn’t over satellite — it’s over the Internet. (Which Jeff Jarvis and I both told him, back when he was still making up his mind. Later Howard kindly gave a hat tip to Jeff on the air.)

    And hey, KDFC can benefit from the same thing.

    Here’s more from The Bay Citizen and the San Francisco Chronicle. And a rescue mission report at SF Weekly… And here’s the audio from a KQED Forum program on the matter. It says that KUSF is slated to become “an online-only training station for students.” Here’s a San Francisco Chronicle story on a gathering at USF at which “almost 500 backers” of KUSF came to confront Stephen A. Privett, the University President. The part that matters:

    Privett said he made the decision because the station, dominated by outside volunteers, “was of minimal benefit to my students.”

    “This was not a crass business decision about dollars,” Privett said. “This was about ensuring our programs involve our students. … Our primary mission is to our students, it is not to the community at large.”

    Privett said some of the $3.75 million would be used to fund the student-led online station, with the rest going to other unspecified educational projects.

    Well, “student-led” suggests that the community might still be involved.

    For frequent updates follow @KUSF, and SaveKUSF on Facebook. Feelings are not weak on this matter. KUSF is much loved by its community.

    On January 20, I put up a new post suggesting that the KUSF community go for 87.7fm. I think it’s available.

    It also amazes me (it’s still January 20) that this post and the next one have not yet received a single comment. Meanwhile my earlier post about Flickr now has 86 comments, and even the highly arcane Geology by Plane has 6. Could it be that the total number of people who care just isn’t that large? Not saying this is a bad thing, just that it’s an isolated one. So far 3,384 people say they like SaveKUSF on Facebook. But liking and doing are way different. As I suggest here, the best bet for doing isn’t trying to make a university turn down $3.8 million for something they clearly wish to unload. It’s to start something new.

    * Signal stuff, for the technical:

    Update on 15 November, 2025:

    • KDFC/90.3 is now on Mt. Beacon with 1000 watts at 988 feet above average terrain, which gives it a much bigger signal than it had as KUSF. Here is a coverage map.
    • KOSC is what KDFC on 89.9 in Angwin is now called atop each hour. Transmitting with 800 watts from atop Mt. St. Helena, it sounded fine to me in the South Bay when I was there a few weeks ago.
    • KXSC/104.9 is KDFC for the South Bay, transmitting with 6000 watts from the foothills east of San Jose.
    • KDFG/103.9 is KDFC for Monterey Bay, transmitting with 1500 watts on a Monterey peninsula hill.
    • K223AJ/92.5 is KDFC for Ukiah-Lakeport, with just 10 watts, but atop North Cow Mountain at 3589 feet.
    • Here is a combined coverage map.
    • KDFC is a strong #3 in the market, with a 5.7 share in the latest ratings.

    So classical still rocks in the Bay Area.

  • Geology by plane

    I’ve been looking gratefully and often, over the past few years, at Louis J. Maher, Jr.’s collection of aerial geology photos. The shots themselves date from 1956-1966, and he put the page up in 2001; but their subjects are the sort that don’t change much over a span of time so short as the last thirty-five years. Dr. Maher is an Emeritus Professor in the Department of Geoscience at the University of Wisconsin-Madison, and specializes in the Quaternary Period, which also happens to be the one in which we live. (More specifically, we operate in the Holocene epoch, which is the name geologists give to the last dozen millennia or so.)

    Explains Dr. Maher,

    I was working on a closed-circuit educational television class in geology in 1966. A problem arose while I was planning the outline of some 43 lectures. Many of the available photographs and films that I wanted to use were copyrighted. Although they could be shown free to normal classes, royalties were required once they were put on video tape. I decided to solve the problem by getting a couple of cameras and spending a month in the West filming my own material. Then a happy thought occurred to me. Why not take some of the pictures from the air? I had earned a private pilot rating in 1964 and had logged about 90 flight hours. It happened that the Geophysics Section of our Geology Department owned a Cessna 170B that had been purchased for aeromagnetic research. At the time N2398D was sitting empty at a local airport, and the University of Wisconsin agreed to absorb 100 hours of flight time for the project. Graduate student and project assistant Charles F. Mansfield indicated he was willing to come along as photographer; I could not have found a more able colleague.

    I have used the color film taken during the flights of 1966 long after the black and white video tapes were discarded, and I have added to the collection over the years. While it is important to have detailed ground-based slides to illustrate geological features for introductory classes, a few shots from the air help to establish their overall relationship.

    These air photos have been very useful in my teaching. I think they can be useful teaching aids for others. I have copyrighted the digital image files, but I am making 360 of them available at no cost for noncommercial educational use.

    That’s also the idea behind all the photos described or tagged geology in my Flickr photo collection. Only a few of those were taken by lightplane. (Those are in this set of the San Andreas Fault, in the Carrizo Plain of California. The pilot was @DougKaye) The rest were all shot from heavyplane, at altitudes of up to forty thousand feet and more. (I think the highest was this one.) Some were shot from the ground, such as during this cross-country road trip. Here’s one sample, from a flight over Greenland:

    So here’s a belated thanks to Dr. Maher for his generosity. As did he, I grant permission to anybody teaching or learning geology to use any of my shots, any way they please. All of them should be CC licensed to permit that. If you find any that aren’t, let me know and I’ll fix them.

  • Pubs: Quit adding promo BS to copied text

    So when I copy the headline “Thousands of Web Users Delete Profiles from Rapleaf” I get more than I asked for. This I find out when I paste it and get the headline, plus “Read more: http://online.wsj.com/article/SB10001424052702304248704575574653801361746.html#ixzz1Ay7eL3K”

    The extra jive after “…html” is tracking stuff, I guess. I don’t know, and I don’t want to know. I also don’t want to deal with it. I want to copy what I see and nothing more. That’s the convention that’s been around since the dawn of text, and it works fine.
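    If you want to scrub that junk after the fact, the tail is easy to strip, since it always lands at the end of the copied text. Here’s a minimal sketch; the function name is mine, and the pattern assumes the appended block always starts with “Read more:” followed by a single URL, as in the example above:

```python
import re

def strip_copy_promo(text):
    """Drop an appended 'Read more: <url>' tail from copied text."""
    # The tail sits at the very end of the clipboard string, after whitespace.
    return re.sub(r"\s*Read more:\s*\S+\s*$", "", text)
```

    Run the pasted text through that before using it, and you get back just what you copied.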

    The Journal isn’t the only pub that does this, but it’s one I’m dealing with right now.

    So, on behalf of users everywhere, I ask, Please: Stop it.

  • What if Flickr fails?

    [21 May 2025 update… This post is suddenly getting a lot of visits. I don’t know why, but I would like to note that Flickr has failed to fail through more than fourteen years since this post went up. I’d also like to thank Flickr for doing a great job of hosting both my own pile of 82,213 photos and the additional 5,449 photos in my Infrastructure collection. Thanks to permissive Creative Commons licensing on Flickr, more than 4,700 photos from those collections have found their way onto Wikimedia Commons, and from there into countless Wikipedia pages. Many more appear in news stories.]

    [2 February 2011 update… A new case has come up, of accidental deletion. More details here and here. The company has also updated its community guidelines. It’s still not clear why the company does not save deleted accounts. My provisional assumption is that the reason is legal rather than technical. But I’d love to hear somebody from Flickr (or somebody familiar with their systems) tell me that’s wrong. In any case, deleted accounts should be kept, somewhere, somehow, one would think.]

    As of last October, Flickr hosted 5,000,000,000 images. I’m approaching 50,000 images on Flickr right now. Sooo… if I lop off a bunch of zeros that comes to… .001% of the total. Not much, but maybe enough to show on their radar.

    Here is what I hope they see: some heavy Flickr users are getting worried. Those with the most cause for worry are at the ‘pro’ level, meaning we pay for the service. (In my case, I pay for two of the four at links above.) One cause for worry is reports of sudden and unexplained account deletions. The other is the possibility that Flickr might fail the way other once-hot Web services are now failing: by declining use, disinterest or mismanagement by the parent corporation, or a decline in advertising revenues.

    Of particular interest right now is a report by Thomas Hawk of Deepa Praveen’s Flickr Pro account deletion. She claims she lost 600 photos, 6,000 emails, 600 contacts, 20,000 favorites, 35,000 comments, 250,000 views and more. “Don’t I deserve a reason before they pressed the DEL key?” she writes.

    Of course we only have her side on this thing, so far, so bear that in mind.

    Meanwhile the closest thing I can find to an explanation in Flickr’s Help Forum is this thread, which leads me to think the most likely reason for the deletion is that Deepa violated some term of service. But, I dunno. Maybe somebody from Flickr can explain in the comments below.

    Still, even if blame for the deletion ends up falling at least partly on Deepa (which I hope it does not, and have no reason yet to think it should), one’s exposure on Flickr goes up with the sum of photos one puts there. And the greater risk is not of Flickr’s deletion of customers, but of the market’s deletion of Flickr. Because, after all, Flickr is a business and no business lasts forever. Least of all in the tech world.

    Right now that world looks to advertising to pay many big Web companies’ bills, and to drive those companies’ valuations on Wall Street and in pre-IPO private markets. Some numbers… The online advertising business right now totals about $63 billion, close to half of which goes to Google. In fact the whole advertising business, worldwide, only comes to $463 billion. (Sources: and Google Investor Relations.) That’s a lot of scratch, but does it alone justify the kinds of valuations that Facebook and Google are getting these days? A case can be made, but that case is a lot weaker if Facebook and Google remain mostly in the advertising business. Which, so far, it looks like they will.

    Wall Street is less enthusiastic about Yahoo, but still a little upbeat, perhaps because advertising is still hot, and Yahoo still makes most of its money from “marketing services.” Flickr is part of Yahoo. I can’t find out how much Flickr brings in, but I’m curious to know what percentage comes from Pro account subscriptions, versus advertising placed on non-pro account pages.

    There are cracks in the edifice of online advertising. This comScore report, for example, and an earlier one, both show that ‘natural born clickers’ (that is, people who like to click on ads, versus the rest of us) account for a huge percentage of all the clicks on advertising, which pays based on “click-throughs”. Chas Edwards says, “these ‘natural born clickers’ are not the most desirable demographic for most advertisers: They skew toward Internet users with household incomes below $40,000 who spend more time than average at gambling sites and career advice sites.”

    Among all the revenue diets a company might have, advertising equates best with candy. Its nutritive value is easily-burned carbohydrates. A nice energy boost, but not the protein-rich stuff comprised of products and services that provide direct benefits or persistent assets. (I can hear ad folks’ blood begin to boil here. “Advertising is nutritive! It delivers lots of positive public and private good!” Please, bear in mind that I made my bones for many years in the advertising business. I co-founded and served as creative director for one of Silicon Valley’s top agencies for many years. My name was on a building in Palo Alto when I did that. I know what the candy is, how it’s made, how easily most companies who use it can get along without it, and how it differs from stuff they can’t get along without.*)

    Regardless of whether or not you think the online advertising business is a bubble (which I do right now, but I’m a voice in the wilderness), we should face a fact: we are seriously exposed when we place our businesses and online lives in the hands of companies that make most of their money from advertising, and that aren’t diversifying into businesses not based on guesswork.

    I just got off the phone (actually Skype) with folks working on a project that examines Facebook. Many questions were asked. Rather than repeat what you’ll hear me say when that show is produced, I’d rather point to one example that should prove at least some of my points: MySpace.

    What’s to stop another company from doing to Facebook what Facebook did to MySpace? More to my point, what’s to stop some new owned-by-nobody technology or collection of protocols and free code from doing to Facebook what SMTP, POP3 and IMAP (the protocols of free and open email) did to MCI Mail, Compuserve mail, AOL mail, and the rest of the closed mail systems that competed with each other as commercial offerings? Not much, frankly.

    So I think we need to do two things here.

    First is to pay more for what’s now free stuff. This is the public radio model, but with much less friction (and therefore higher contribution percentages) on the customers’ side. We’re working on that with EmanciPay. Here’s a way EmanciPay will help newspapers. And here’s our Knight News Challenge application for doing the same with all media sources. You can help by voting for it.

    Second is to develop self-hosted versions of Flickr, or the equivalent. Self-hosting is the future we’ll have after commercial hosting services like Flickr start to fail. Fortunately, self-hosting is what the Web was meant to support in the first place, and the architecture is still there. We’ll have our own Flickrs and Zooomrs and Picasas, either on servers at home (ISP restrictions permitting) or in a server rack at the likes of Rackspace. But somebody needs to develop the software. One developer has been working in this direction for years, FlickrFan being one example. The end point of his work’s vector is silo-free everything on the open Web. We are going to get there.

    Fortunately Flickr has a generous App Garden of APIs that allows copying off most (or all) of the data that goes with your photographs. I’m interested in being able to copy all my photos and metadata off into my own self-hosted system. How much they would welcome that, I don’t know. But their API is certainly encouraging. And I do want them to stay in business. They’ve been a terrific help for me, and many other photographers, and we do appreciate what they’ve done and still do. And I think they can succeed. In fact, I’d be glad to help with that.
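    To give a feel for it, here’s a minimal sketch of pulling one page of a user’s photo metadata through Flickr’s documented REST API. The method name, endpoint and parameters (including the `extras` field list) are from Flickr’s API documentation; the function name, key and user ID are my own placeholders, and a real export would loop over `page` until the reported page count runs out:

```python
from urllib.parse import urlencode

REST_ENDPOINT = "https://api.flickr.com/services/rest/"

def metadata_request_url(api_key, user_id, page=1):
    """Build the URL for one page of a user's public photos plus metadata."""
    params = {
        "method": "flickr.people.getPublicPhotos",
        "api_key": api_key,
        "user_id": user_id,
        "per_page": 500,  # the API's maximum page size
        "page": page,
        "format": "json",
        "nojsoncallback": 1,
        # Metadata fields to archive alongside each photo; url_o points
        # at the original-size file for downloading.
        "extras": "description,date_taken,tags,geo,url_o",
    }
    return REST_ENDPOINT + "?" + urlencode(params)
```

    Fetch each URL, save the JSON, and download each photo’s `url_o`, and you have a self-hosted copy of both images and metadata.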

    But mainly I want them, and every other silo out there, to realize that the pendulum has now swung full distance in the silo’d direction — and that it’s going to swing back in the direction of open and distributed everything. And there’s plenty of money to be made there too.

    I think they might also consider going all-pro or mostly-pro. I say that because I’m willing to pay more than I do now, for a serious pro account — meaning one in which I have more of a relationship with the company. When the average price of first-rate cameras and lenses each run well into four figures, paying, say, $100+ per year for hosting of photos and other value-adds isn’t a bad deal. Hell, I used to pay that much, easy, per month, for film processing, back in the last millennium. And I did most of that at Costco.

    So here’s hoping we can talk, that Deepa can recover what she’s lost (or at least see a path toward something better than the relationship she had with Flickr), and that the entrepreneurs and VCs out there will start seeing value in new open-Web start-ups, rather than the ad-funded and silo’d ones that are still fashionable today.

    [Later (28 January)…] Thomas Hawk reports,

    …after getting three previous non-answer emails from them over the past few weeks, this morning they seem to have finally given her an official answer on why her account was deleted.

    From Flickr:

    “Hi there,


    Like I said before, we saw behavior in your account that
    went against our guidelines and required us to take action –
    which was to delete your account. Our guidelines apply to
    any and all content you post on Flickr – photos you upload,
    comments you make, group discussions you participate in,
    etc.

    I am afraid I cannot give you any more specific information
    than this.

    Thank you for your understanding,
    Cathryn”

    The only problem is though, according to Deepa she said she hasn’t participated in any discussions or group threads in Flickr for over a year. And she felt that her content very much adhered to the Flickr Guidelines.

    I assume that Cathryn had no answer, and that this was the best Flickr could do.

    I would like to say this is unacceptable, except that it is acceptable. We accept it when we click “accept” to Flickr’s terms of service when we take out an account with them. And Flickr is no exception here. ALL websites and services like Flickr’s have similar terms.

    And we can’t expect the sites to fix them. We have to do that, by proffering our own terms.

    Which we’re working on. Stay tuned.

    *I actually have hopes for advertising — not as the super-targeted, quant-driven, “personalized” stuff that’s all the rage these days; but as a new communications mechanism on the corporate side of real conversational marketing, in which the customer has full status as a sovereign individual, and takes initiative, expresses intentions, and engages through mechanisms he or she controls (and preferably also owns).

  • How about a timed phone un-silencer?

    I’m sure all of us with mobile phones do the same thing. When we go into a meeting, a movie, church or whatever, we silence our phones. And then forget to un-silence them when we’re done. Then, after too much time has passed, we remember — or are reminded by means other than the phone, such as a spouse saying “Why didn’t you answer when I called?” — that we’d turned it off.

    So I suggest an un-silencer option. You would set the silencer to snooze for one, two, three or some other number of hours, and then return to normal.

    Maybe some phones have this already.

    Yes, I know that on some phones, such as the iPhone, the silencer is a physical slider. But it can still be done in software on phones that allow it.

    And yes, I know this is a trivial issue, but it’s how I’m dealing now with three missed calls.

  • How should we pronounce 2011?

    Is it “twenty eleven” or “two thousand eleven”?

    I’m hearing more of the former, I think. By that I mean “twenty eleven” is more commonly used than was “twenty ten,” and the “thousand” thing is wearing off.

    Sooner or later it will have to. I doubt we’ll be saying “two thousand thirty two” when 2032 rolls around. “Thousand” persisted through the ’00s (the “aughts”), but is getting a bit stale now that we’ve turned the calendar up to eleven.

    I haven’t bothered to check, but is one more correct than the other? Does the AP have a position on this, for example? Just wondering.