The Wayback Machine - https://web.archive.org/web/20090219012619/http://www.windley.com:80/archives/2003/02/


February 28, 2003

A Few Mac Updates: Clicker and Konfabulator

I gave up on Clicker, at least for a little while. The trial ran out and I suspect it was making my machine unstable (at least Aqua). There are some reports of that, and I started experiencing some weirdness after installing it. There's a new version out, but they want $10. I'll probably pick it up and try it out; I'll let you know.

I also started playing around with Konfabulator at Colin Kelly's suggestion. It's pretty cool. JavaScript and XML lead to interesting little applets. I like the clock, calendar, picture frame, and weather applets. I didn't like the news aggregator applet, which is what made me download it in the first place. It tries too hard to be "edgy" and in the process ends up being hard to use. Do I need these little applets? No. But they're cool eye candy.

Konfabulator is also shareware and has an annoying window that reminds you to pay and can't be dismissed or reduced. I'm waiting to pay until I play around with changing some of the scripts and actually programming the thing. I think the folks at Vultus ought to check it out.

5:46 PM

February 27, 2003

XML for First Responders

Earlier, I wrote about XML for criminal justice. Today I found a reference to XML for first responders (they say emergency management). Here are some of the initiatives that look interesting:

  • Common Alerting Protocol, whose premise is that "a standard method should be developed to collect and relay instantaneously and automatically all types of hazard warnings and reports locally, regionally and nationally for input into a wide variety of dissemination systems."
  • Automatic Crash Notification Initiative, which would be used in a system like OnStar so that your car can notify the police when it's been in an accident.
  • Emergency XML, which would create an open XML-based standard for emergency management data exchange.

There are more listed, but these were the most interesting to me right now. These sorts of initiatives are going to make a difference because in the past, even though the technology was there to exchange this information, no one could solve the political question of which format to use. Now the world (thanks to XML) is forcing people to develop standard interchange formats. The other side of the job is transport, and while that has to be done well and right, it's not nearly as political.
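To make the idea of a standard interchange format concrete, here's a sketch in Python (using the standard library's ElementTree) of what a minimal hazard-alert message might look like. The element names are purely illustrative; they're not taken from the Common Alerting Protocol or any other actual specification.

```python
import xml.etree.ElementTree as ET

def build_alert(event, severity, area, description):
    """Build a minimal, hypothetical hazard-alert document.

    Element names are illustrative only, not from any real spec.
    """
    alert = ET.Element("alert")
    ET.SubElement(alert, "event").text = event
    ET.SubElement(alert, "severity").text = severity
    ET.SubElement(alert, "area").text = area
    ET.SubElement(alert, "description").text = description
    return alert

# Any dissemination system that understands the shared format can
# parse the same document, no matter who produced it.
doc = build_alert("Flood", "Severe", "Salt Lake County",
                  "River above flood stage")
xml_text = ET.tostring(doc, encoding="unicode")
```

The point isn't the code, which is trivial, but that the hard part (agreeing on the element names and their meanings) is political, not technical.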

11:21 PM

Our Network is for Selling Mops

I was telling this story to someone the other day and they suggested that I ought to write it down, so I decided it might make a good blog article.

In 1994, a friend of mine, Steve Fulling, was in Oregon building a statewide, high-speed network to connect the state's engineering schools at DS3 speeds (for you youngsters that was pretty fast in 1994). The project was called NeroNet. Steve talks about how they'd sit around the conference room hypothesizing things that people might do with a high-speed network. They came up with lots of lofty ideas: exchanging x-rays, doing weather simulations, doing physics experiments, distributed computations, etc.

At the same time, I was in Utah building an eCommerce site called iMALL.com with another friend named Ross Jardine. One of our first ideas was to create something we called "Deals of the Day." There was a company in Orem, Utah that was kind of a 1980s version of Overstock.com: they bought distressed merchandise and then liquidated it. But since there was no Internet, they did it by sending out faxes to thousands of small merchants who paid a monthly subscription fee to get access to the deals. We signed up to get the fax for a few dollars a day and then put the deals on the net. Within months we were the company's largest distributor by far, and Deals of the Day was off to a multi-year run as an iMALL.com staple.

That's where these two stories come together. In the beginning of October 1994, iMALL.com went live and Deals of the Day featured its first deal: a case of six Wonder Mops. I called Steve with some excitement and told him we were live and that he could now buy something on the Internet. It so happens that he was just going into one of these brainstorming sessions on what to do with all the bandwidth NeroNet would deliver to the engineering schools of Oregon. He walked in, pulled up iMALL.com in Mosaic (bonus points if you know what that is) and told the assembled group of academics, "I know what our network is going to be used for: selling mops."

Of course, there were the expected guffaws and then they all pitched in and bought a case of mops---iMALL.com's first sale. If I remember right, the price for a case of mops was $36. Now, I don't know for sure that these mops were the first credit card purchase of a consumer good on the Internet, but if it wasn't the first, it was darn close. We've still got one of those mops (courtesy of Steve) and I've shown it off many times as the beginning of eCommerce.

9:57 PM

February 26, 2003

SB 151: Utah's New CIO Act

The Utah Legislature is debating a new CIO statute for the State: Senate Bill 151. The bill clears up some long-standing problems which had plagued the CIO's office:

  • The IT Commission is abolished and a cross branch coordinating body called the Utah Technology Commission is established in its place. If the UTC can tackle the cross branch IT coordination problem, particularly with respect to policy, this will be a positive step.
  • The planning process is changed to be top-down (CIO creates strategic plan and agencies follow rather than the CIO merely aggregating agency plans).
  • Rule making and policy making authority are cleared up.
  • A whole new section on interagency cooperation has been added.
There are only two small problems that I see:
  • The bill increases the amount of work the CIO's office will have to do and there's not likely to be any increase in staff to support it.
  • The statute has no teeth at all. The CIO can plan all he wants, but there's nothing in the bill to require agencies to cooperate. Even the CIO's power to control purchases has been stripped. The CIO is now only allowed to "monitor" purchases.

It's likely that with these changes Utah has one of the most emasculated CIO statutes in the country. Something to celebrate, unless you're a taxpayer.

8:41 PM

eBusiness Lecture

I gave my guest lecture at the Rollins Center today. Actually I gave the same lecture twice, once at 2pm and once at 4pm. Before the first lecture I was able to have lunch with some of the staff and students in the Center. The experience was very enjoyable. Here's a copy of my slides from the lecture.

8:09 PM

February 25, 2003

HB 240 Update

HB 240 passed the House today 68 to 1! I'm very surprised. Many old hands thought this was going to be an uphill battle. It's not over yet; there's still the Senate and then the Governor. The word is that Sen. Valentine is the lead in the Senate and, after a number of meetings, seems positive and supportive. The Governor's office has received more email on this one bill than any of the others. Send mail, faxes, and emails to your senator. And, of course, the Governor's office.

9:37 PM

Automation as a Competitive Advantage

I spent the last two days with a working group of people from a number of companies looking to create a product (company) that provides more automation for enterprise application integration and the programming tasks associated with it. The great paradox of automation is that it leads to productivity gains and at the same time also increases quality because of greater repeatability. This has been and will continue to be a bitter pill for IT employees to swallow.

In that sense, they're no different than workers in other industries. The automobile workers resisted automation with everything they had until the Japanese, with a heavy dose of the very thing they were fighting, started to close down their plants. Then they realized that it was a choice between automation or not having an automobile industry in the US. The Japanese used our reluctance to automate as a competitive advantage. Automation gave them productivity, but the real gain was in the quality of their product---a property largely enabled because automation created repeatable processes.

Probably the best instance of this phenomenon in IT is desktop management. It won't come as any surprise to regular readers of this blog to know that I'm a big fan of desktop management in the enterprise. The promises of desktop management are the same as EAI automation: better productivity (read "reduced costs") and increased quality. The pushback that I always got in the workplace was IT workers whispering to anyone who would listen, "sure, it might cost less, but you'll never get good service." In fact, every automation example you can find shows that, properly implemented, automation leads to better quality.

To get a little more specific, I estimate that the State of Utah could save $10-20 million per year through desktop management and other reforms in the way IT is managed in the state. That's significant money. I don't espouse making the changes and then hoping the savings arrive; rather, the State ought to commission a third-party study of how IT is organized and how service is delivered. Unfortunately, employees are so scared of the automation and what it might mean to them that they have poisoned any discussion of automation with red herring arguments about other issues. I challenge the legislature to appropriate some money for a study by a reputable group and then follow their advice. You owe it to the taxpayers of the State.

What does this mean to your business? Competitive advantage doesn't usually follow from something that anyone can go out and buy. You can get some advantage in the short term (3-5 years) however, if your enterprise is able to make the move to automation in IT early while your competitors flounder in self-doubt and angst.

6:23 PM

Scary

My computer scared me this morning. As I walked in my office, James Taylor suddenly started singing "Walking Man."

8:24 AM

eBusiness Trend Keywords

I've been thinking about what I'm going to say tomorrow in my lectures at the Rollins Center for eBusiness. My original working title was "Life as a CIO" but that didn't really capture what I wanted to say. I've put together a talk about current hotspots (as I see them) in information technology and their impact on the enterprise. Here are the things I'm going to mention. Any that I'm forgetting?

The idea of the talk is not simply to go over a list of cool things in IT but to talk about how these trends are all related. These trends can be described by some non-empty subset of the following keywords:

  • Personalized
  • Peer-based
  • Decentralized
  • Collaborative
  • Connected

Any suggestions on other important keywords to describe current important trends in IT and eBusiness?

8:21 AM

February 24, 2003

Your Phone as a Proxy for Presence

One of my pet feature requests is the ability to use my Bluetooth-enabled phone to indicate presence. The idea is simple. Rather than having to manually click on "available" or "check back later" in iChat, I want it to select automatically based on whether or not my Bluetooth-enabled T68i phone is nearby. Since I always have it with me, it's a convenient proxy for my presence. Someone has recently solved this problem, or at least the hard part.

Today, Will Cox points me to Sony Ericsson Clicker, a handy program that lets you use a Bluetooth phone to control a PowerPoint slideshow, iTunes, or any AppleScript-able application. Being relatively new to the Mac, I'm not that familiar with AppleScript, but presuming that iChat has AppleScript hooks, my dream could become a reality.

I downloaded the preview release of the software (which is going to stop working on Mar 1, 2003 according to the web site) and installed it on my PowerBook. The instructions are clear and work great. One note: you may not have a PreferencePanes folder in your personal library (~/Library/PreferencePanes); just create it. I fired it up and in minutes was controlling a PowerPoint presentation. This is way cooler than the KeySpan device I currently use to control PowerPoint remotely for one simple reason: Convergence!

The next little experiment was really too cool. Clicker can take action based on presence, so I set it up to pause iTunes when I leave the room and start playing when I return. Worked like a charm. My computer now senses my presence and takes the action I've selected. This is totally the right thing. Apple needs to make this part of OSX.
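The presence-triggered behavior boils down to a simple polling loop: check whether the phone is in range, and fire an action whenever that answer changes. Here's a hypothetical Python sketch of that logic; it is not Clicker's actual implementation, and the `check` callable stands in for whatever Bluetooth proximity test the real program performs.

```python
class PresenceWatcher:
    """Fire a callback when the phone appears or disappears.

    A hypothetical sketch of presence-based actions; `check` is a
    stand-in for a real Bluetooth proximity test.
    """

    def __init__(self, check, on_arrive, on_depart):
        self.check = check          # returns True if the phone is in range
        self.on_arrive = on_arrive  # e.g. resume iTunes
        self.on_depart = on_depart  # e.g. pause iTunes
        self.present = None         # unknown until the first poll

    def poll(self):
        """Call this every few seconds; fires a callback on each change."""
        now_present = self.check()
        if now_present != self.present:
            (self.on_arrive if now_present else self.on_depart)()
            self.present = now_present
```

Usage would look like `PresenceWatcher(phone_in_range, itunes_play, itunes_pause)` with `poll()` called on a timer. The key design point is that actions fire only on transitions, not on every poll, so iTunes isn't told to pause over and over while you're out of the room.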

Now for the bad news. iChat is apparently not scriptable. I was able to activate it from AppleScript, but not set my status. This program can set the status using an internal, undocumented API. Ugh. I installed it and now the title of whatever's playing in iTunes is displayed as my "status" in iChat. So, with both programs, you get presence in iChat from a T68i phone: if my status is "Quiet in iTunes" then my phone is not near my computer. :-)

Come on Apple! Let's get AppleScript support in iChat! There's coolness there.

6:16 PM

HB 240: Fund of Funds

I've written before about Utah House Bill 240. HB 240 would create a fund of funds for Utah's venture capital community backed by contingent tax credits. The article in the Deseret News discusses some of the actions and reactions related to the bill. This bill is important to the high tech community in Utah and it needs your support. Write or call your legislators and send an email to the Governor's office. Let them know you support the bill and ask for their support.

Update: HB 240 passed out of committee this afternoon which means it will be considered by the full House next. This is the critical phase where your input can make a difference. The committee is mostly made up of the converted. The skeptics are your representatives who aren't on the committee.

9:35 AM

February 21, 2003

Being Smart About Business Intelligence

Many companies have achieved considerable success in using BI (business intelligence) tools. Wal-Mart, General Electric, and Cisco have all expended huge sums on BI solutions, and give these systems a great deal of credit in helping them successfully manage their business. Siebel Systems, by virtue of tight controls on processes and doing things right from the start, has also created an internal BI system that is a model for what many companies are trying to do. [Full story at InfoWorld.com...]

I've started to write an occasional article for InfoWorld. I'm excited to be able to write about enterprise computing issues and the chance to see some interesting, new things. Jon Udell turned me on to this opportunity and I'm grateful. The figure at the right is a representation of the hierarchy of IT needs I mention in the article. This is similar to the concept I laid out in the Road to the Future document I prepared for the IT Commission last November. I intend to do a longer white paper on this hierarchy soon because I think it's a useful high-level roadmap for guiding IT investment.

6:21 PM

InfoPath (or XDocs)

In this InfoWorld article, Jon Udell gives the 10 things you should know about InfoPath (née XDocs). There are a couple of points I thought deserved emphasis:

  • The product includes a full-blown DOM, not just a SAX API, which means that you should be able to manipulate the XML, not just read it, from an outside program.
  • There's a visual XSLT tool. As Jon points out, XSLT is powerful, but difficult to use (unless you're a Prolog programmer---then it's old hat).
  • InfoPath can generate a schema from an XML snippet. You may not want to use this generated schema as your final version, but it's a useful way to quickly get something that's pretty good and then refine it later.
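The DOM point is worth a quick illustration. A DOM-style API parses the whole document into an in-memory tree you can rewrite and re-serialize, whereas a SAX-style parser just streams events past you. Here's a small sketch of the difference using Python's standard library (not InfoPath itself, of course); the `order` document is made up for the example.

```python
import xml.etree.ElementTree as ET

snippet = "<order><item qty='2'>Mop</item></order>"

# A DOM-style API gives you the whole document as a tree...
tree = ET.fromstring(snippet)

# ...so you can manipulate it, not just read it:
item = tree.find("item")
item.set("qty", "3")                       # change an attribute
ET.SubElement(tree, "status").text = "shipped"  # add a new element

# and write the modified document back out.
modified = ET.tostring(tree, encoding="unicode")
```

With a SAX API, by contrast, you'd get a `startElement("item", ...)` callback and the data would be gone once the event passed; there's nothing to edit and nothing to serialize back.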

Jon claims that this is a paradigm shift and I agree. Having a tool that stores unstructured data in a semi-structured, common format and is likely to be widely used because of the links to the new Office suite is a powerful combination. Moving the vast quantities of unstructured data to an easily accessible semi-structured format will bring huge changes to the enterprise. One example is the ease with which you can replace what is now a custom application. There are companies waiting to be born to take advantage of this change.

9:21 AM

February 20, 2003

Utah Senate Computer Problems

This article from the Deseret News reports a computer failure in the Utah Senate that kept Senators from conducting business. I guess even government needs reliable computers. While I certainly think the topic of the article is fair game, I think it takes a cheap shot at Greg Johnson, the IT manager for the Senate. The article talks about "the glitch-free House," which seems like a way of rubbing salt in the wound. I think it's unfair to single Greg out when the problem probably has more to do with a lack of resources than any deficiency in his knowledge or efforts.

One of the biggest IT problems facing a legislature is that you can't buy off-the-shelf software that does legislation. Every legislature is different, and even if they were procedurally the same, there are only 57 of them (counting territories). So every body builds its own. In Utah, I believe even the Senate and the House use different systems. Argh! You'd think that they'd be able to agree to build a single system that meets their needs.

If I were called on to do the task (fat chance), I'd probably try something using the XML standards developed by the US House. If it were a little further along, I'd be tempted to use Office 11 as the basis. I'd bet there's a real slick system that could be based on those two things and easily customized for different legislative bodies.

One thing you can be sure of, however---there will not be a legislative audit of this matter. They save those precious experiences for the executive branch since legislative audits are more about politics than they are about good government.

7:41 PM

Lecture at Rollins eBusiness Center

I'll be giving two lectures (at 2pm and 4pm) at the Rollins Center for eBusiness at BYU on February 26th. This is part of their guest lecture series. My topic will be "Life as a CIO," I think. The center is endowed, in part, by Kevin Rollins, COO of Dell who is a BYU alum.

7:11 PM

February 19, 2003

Linux Networx and Bernard Daines

In 1996, someone named Bernard Daines came to BYU, where I was teaching, and gave a talk about a brand new company he'd founded named "Packet Engines." Packet Engines made the first gigabit Ethernet switch. Since I was chair of the capital equipment committee, I bought one. We had one with a very low chassis number. I didn't realize at the time that Bernard had founded Grand Junction and sold it to Cisco in 1995. He'd later sell Packet Engines to Alcatel amid some controversy (for some interesting reading, see this article on Bernard Daines).

I've recently discovered two other interesting connections. Bernard Daines is the founder and former CEO of World Wide Packets, the company that makes the breakout boxes being used by Provo City in their network. It's a cool little box about the size of a paperback book that has a fiber connection on one side and four 100 Mb Ethernet connections and two digital TV connections on the other side.

Bernard Daines is also the man behind Linux Networx, perhaps the coolest start-up in Utah at present. I had a chance to visit Linux Networx today and spend a little time with Steve Hill, the CEO. (Steve has an interesting history in his own right.) Linux Networx puts together Linux clusters and sells them (the last part is important if you want to keep doing the first part). They built a 1150 node machine for Lawrence Berkeley National Labs (each node has 2 processors and 2 Gigs of RAM). They also build four node clusters that run Oracle. Every order is custom made on the assembly room floor in Sandy, Utah. If you like Linux, big iron, or just like opening up new hardware and putting it together, this would be an incredibly cool place to work.

Seems like I keep bumping up against things Bernard Daines is involved in. I'd love to meet him again. He's got a talent for finding cool things to do and getting them done. It doesn't hurt that thanks to two successful start-ups he also has the money to be self-funding.

8:31 PM

Book Review: IT Web Services by Alex Nghiem

Last week, I mentioned that I was reading a book called IT Web Services: A Roadmap for the Future by Alex Nghiem (pronounced "neem"). I've worked my way through it and have some comments.

Nghiem is the President of a consulting company called Blue Samba Solutions. The first five chapters of the book are the requisite introduction to web services. If you already have a good handle on it, you can probably just skim this or even skip it altogether. On the other hand, it's well written and it managed to clear up a few cloudy issues for me. What's more, I appreciated getting Nghiem's take on some things. Chapters six through nine are the heart of the book and make it well worth the price.

Chapter Six is entitled "Web Service Networks" and takes the form of a brief introduction to the topic followed by two interviews: one with Craig Denato, the CEO of Grand Central Communications, and the second with David Spencer, the CTO and CEO of Flamenco Networks. Both companies have similar goals: fill some of the holes in web services implementations. Grand Central operates a fee-for-service value-added network for web services, and Flamenco sells software that creates a P2P network through proxies installed in-between communicating partners. I found this chapter to be very interesting and informative. I had a better understanding of what these two companies do and what their business models are after reading it than I got from visiting either company's web site.

Chapter Seven is entitled "Web Service Architectural Patterns." I was really looking forward to this chapter and came away slightly disappointed. The material was good, but pretty skinny. Maybe web services just aren't mature enough yet to have developed a significant number of patterns. Nghiem discusses four patterns:

  • Native web services
  • Web service proxy
  • Document-centric web services
  • Orchestration web services

I'll probably come back to this topic later in my blog and discuss Nghiem's patterns and solicit others.

Chapter Eight gives a high-level plan with analysis for adopting web services. Much of this is common sense that any good CTO would probably understand, but it's still a good checklist to review as you begin a web services project.

Chapter Nine discusses software as a service and doesn't really seem to belong until you read the included interview with John Alberg, the VP of Engineering for Employease, an HR ASP. John talks about how they use XML and web services to implement their service.

The rest of the book (another 100 pages) is appendices that cover ebXML, case studies, interviews with web service platform vendors, and a product review of Iona Technologies XMLBus product. A review copy of the product is included on a CD in the book.

Overall, I found the book to be informative---the interviews alone are probably worth the price. The formatting has some errors and there's some disconnectedness that gives it a "thrown together in a hurry" feeling, but that doesn't really affect the book's ability to deliver the information. I recommend it.

2:32 PM

SLC Public Library

I needed to meet with some folks in downtown Salt Lake today and didn't have immediate access to a meeting facility, so I had them meet me at the newly opened public library. If you're local and haven't seen it yet, you really ought to visit---it's a great facility. Plenty of meeting space, study tables, reservable conference rooms, and even some retail space for newspapers, comics, coffee, and cards. It's right across from the Salt Lake City building on 4th South.

As an aside, Jon Udell's library lookup service works for this library.

1:56 PM

February 18, 2003

Public Service Tip No. 5: Avoid the 'L' Word

This story is part of an ongoing series of tips and advice for private sector people who might be considering a stint in public service. See the feature page for an index to the complete set.

This InfoWeek story is about state budget shortfalls and what that means to IT managers in the public sector. The article says that tight budgets mean that public sector IT directors face the same tough decisions that their private sector peers do. That's not true: they face worse choices. To understand why, you have to understand a fundamental principle of public management: sacrifice anything before you manage the size of your workforce.

Time and time again, I saw legislators and executive branch management clutch at any straw they could to avoid the "L" word: layoffs. Part of that is because of the rules surrounding how you can lay people off. You have to let the people with the least seniority go first---and those are typically the people you hired most recently for your newest pet projects, with the best training, and who are making significant contributions. The other part is just general disdain for the idea of workforce management. I've often said that the public sector is 20 years behind the trends in the private sector in workforce issues and this is just an example of that. My private sector views toward workforce management were some of what made my tenure at the State so rocky.

So, what does this mean to IT? Anyone who's tried to make budget cuts will tell you that it's very difficult to make significant budget reductions without reducing the size of your workforce since almost all of your expenditures (outside federal pass-throughs like Medicaid) are on people. When you try, you end up cutting all capital investment, all new projects, any investment, and just hunkering down to wait out the storm. IT capital budgets happen to be one of the largest buckets of General Fund money available and so they make a tempting target. Desktops won't be upgraded, new software won't be purchased, and LANs will languish.

One of the best arguments you can make for desktop management to a CFO is that it evens out the cash flow for a pretty expensive item in the IT budget: employee desktops. That wasn't a popular argument at the State because what it meant to a State manager was that you were taking the float out of their budget. Lots of IT equipment gets bought at the end of the year after the program manager is sure that there's extra money. Evening out that cash flow would mean that there was that much less money to play with in tough times and (heaven forfend) someone might have to mention the "L" word.

Some might misunderstand me and assume that I'm arguing for a scenario where employees are sacrificed to save any IT project. That is not the case. I'm arguing that good managers, whether they be in the public sector or the private, should ask "what's in the best interest of the citizen (or my shareholders)?" In my experience, that question is not asked. The question is always "what will help us avoid having to lay off employees?"

I seem to be out of step with the vast majority of public sector managers, but I believed that my first duty was to the citizens. I remember one meeting when the budget crisis first started up. I innocently raised my hand and suggested to a group of government leaders who were discussing the problem that this wasn't such a dire circumstance: we had 22,000 employees and the entire problem could be solved by laying off 200 of them. I was adamant that any organization with 22,000 employees could probably find 200 who weren't making a significant contribution. I would have drawn less attention farting in church. I was told that every employee was making a significant contribution and that they would all be missed. No one could spare even one. What's more, I was warned that such talk could get me in real trouble. Turns out they were right on at least one count.

9:42 PM

GXA: Is Any of it Real?

Patrick Logan sent me an email in response to my GXA posts asking an important question: is any of this real? I have to admit that I started looking at GXA not because of some special insight that it was the real thing, but only with the thought that the issues being addressed are real and this was as good a place as any to start my study.

A quick search on Google for "WS Routing" yields sample code, an experimental implementation, and a sample client, plus the specification references. So, at least with respect to routing, a cursory examination would conclude that it's not real...yet. I add the "yet" because I think that even if GXA isn't the answer, something similar will have to be invented to take its place. The issues definitely are real.

I've written about application layer internetworking before and mentioned some of the companies in that space. I found another tonight: Talking Blocks. Their product looks interesting. They make no mention of the GXA specifications, but they do make mention of the problems. I think it remains to be seen whether these standards will be adopted by these nascent companies or not. Maybe Microsoft and IBM have enough clout to force the standards, but so far they don't have products that use the standards either as far as I can tell. How the market will address these issues will probably surprise us all.

8:41 PM

February 17, 2003

GXA Components: WS Routing

I've been cataloging the GXA specifications and trying to provide my own roadmap to what's happening in that area. I've created an index to the articles under "Global XML Web Services Architecture." Today, the topic is the web service routing specification.

WS Routing provides an extension to the SOAP envelope for describing the route that a particular message should take. The protocol can be used to describe the ordered path from the originator of the message, through multiple intermediaries, to the final destination. For example, in the following diagram, A is the originator, D is the receiver, and B and C are the intermediaries.

The WS Routing specification defines a "path" element that can be added to the header of a SOAP message in much the same way that the security elements are added in the WS Security specification. The path element can contain:

  • a "from" element for the message originator (A),
  • a "to" element for the final destination (D),
  • a "fwd" element to contain the forward message path for the intermediaries, and
  • a "rev" element to contain the reverse message path.

This example (taken from the Microsoft page describing WS Routing) shows a SOAP envelope that contains a path element describing the picture shown above.

<SOAP-ENV:Envelope
      xmlns:SOAP-ENV="http://www.w3.org/2001/06/soap-envelope">
   <SOAP-ENV:Header>
      <wsrp:path xmlns:wsrp="http://schemas.xmlsoap.org/rp/">
         <wsrp:action>http://www.im.org/chat</wsrp:action>
         <wsrp:to>soap://D.com/some/endpoint</wsrp:to>
         <wsrp:fwd>
            <wsrp:via>soap://B.com</wsrp:via>
            <wsrp:via>soap://C.com</wsrp:via>
         </wsrp:fwd>
         <wsrp:from>soap://A.com/some/endpoint</wsrp:from>
         <wsrp:id>uuid:84b9f5d0-33fb-4a81-b02b-5b760641c1d6</wsrp:id>
      </wsrp:path>
   </SOAP-ENV:Header>
   <SOAP-ENV:Body>
      ...
   </SOAP-ENV:Body>
</SOAP-ENV:Envelope>

All in all, the idea and the execution are pretty straightforward, much like the WS Security specification. There are a few important points:

  • The SOAP message contains the route, so the message can be routed independently without any intermediary having to refer to or be in communication with the originator or any other central coordinator. Once the message is on its way, each intermediary can get it to the next recipient on its own.
  • A corollary to the last point is that the SOAP message is, as it was before the insertion of the "path" element, transport neutral. This means that each intermediary is free to choose whatever transport is available or best for getting the message to the next recipient. So, in the example given above, A could send the message to B using HTTP, B could send it to C using Jabber, and C could send it to D using SMTP.
  • The "rev" element is constructed as the message is moved along. Each node moves its "via" element from the "fwd" element to the "rev" element.
  • The example given above shows a static route, but the WS Referral specification provides a mechanism for dynamically discovering the route. I'll cover that next.
  • In keeping with the modular nature of the GXA specifications, the WS Routing specification is not concerned with security, reliability, retransmission, transactions, etc. These are handled by other components in the GXA family.
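
To make the "rev" construction concrete, here's a small Python sketch of one forwarding step at an intermediary. This is my own illustration of the "fwd"/"rev" bookkeeping, not code from the WS Routing spec:

```python
# Sketch of one forwarding step at a WS-Routing intermediary. This is my
# own illustration of the fwd/rev bookkeeping, not code from the spec.

def forward_step(path):
    """Advance the message one hop.

    `path` models the wsrp:path header as a dict: "to" is the final
    destination, "fwd" the remaining forward vias, and "rev" the reverse
    path built up so far. The current node moves its own via entry from
    "fwd" to the front of "rev", then picks the next recipient.
    """
    via = path["fwd"].pop(0)      # this node's entry in the forward path
    path["rev"].insert(0, via)    # record it on the reverse path
    # next hop is the next forward via, or the final "to" when none remain
    return path["fwd"][0] if path["fwd"] else path["to"]

# A -> B -> C -> D, as in the example above:
path = {"to": "soap://D.com/some/endpoint",
        "fwd": ["soap://B.com", "soap://C.com"],
        "rev": []}
assert forward_step(path) == "soap://C.com"                # B forwards to C
assert forward_step(path) == "soap://D.com/some/endpoint"  # C forwards to D
assert path["rev"] == ["soap://C.com", "soap://B.com"]
```

Each hop needs only the header itself to know where to send the message next, which is what makes the routing independent of any central coordinator.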

10:31 PM | Comments () | Recommend This | Print This

February 14, 2003

IT Web Services

I just picked up a copy of a book called IT Web Services: A Roadmap for the Future by Alex Nghiem (pronounced "neem"). I was anxious to read this for a few reasons. First, there are a couple of interesting looking chapters on web services networks and web services architectural patterns. Second, this is part of the Harris Kern series which has been traditionally aimed at issues in building rock-solid reliability in IT infrastructure. For example, his IT Organization book is an excellent read on building organizations to offer reliable IT services. I'll let you know what I think of this new book when I'm done.

4:37 PM | Comments () | Recommend This | Print This

February 13, 2003

Why Digital Identity Matters

Jon Udell is pointing to a column he wrote in 2001 that reviews Jeremy Rifkin's book The Age of Access. In Jon's column, he gives what I think is the most succinct explanation of why digital identity matters:

Rifkin's central theme is simply stated. We are entering a new stage of capitalism. Its defining principle is no longer ownership of property bought and sold in markets, but rather access to services leased within networks of suppliers and users. As consumers, and as businesses, we spend less on one-time purchases, and more on subscriptions to a growing array of services. Many of these services are delivered through electronic networks -- electricity, Internet connectivity, online content. But as Rifkin points out, tangible things -- cars, computers, office buildings, and inventory -- are also "dematerializing" into services. Ownership of such things is becoming a liability, something to outsource.

In a property regime, commercial transactions can be (relatively) anonymous, and are of brief duration. I walked into a local computer store a few months ago, and paid $25 in cash for a PC video adapter. The proprietor might or might not recognize me on the street; might or might not have a record of that transaction; might or might not have any further contact with me. In an access regime, transactions are never anonymous, always recorded, and embedded within a long-term relationship.

One of the prices that we pay for the convenience of using services, rather than owning products, is the burden of repeated authentication. I have to identify myself, every time, to gain access. Whether I do so directly, with my own name, or indirectly, by way of a pseudonym, is a matter of architecture and policy, and will determine whether or how the slippery concept of privacy will govern my use of the service. But the fact is that some real or pseudonymous identity is a condition of access.

I once had an IT director at the state ask me why anyone cared about directories and single sign-on. They argued vociferously that we were wasting our time putting together a master directory. Their day probably involved one authentication to the Novell file server and not much else. In that world, who cares? But in the world envisioned by Rifkin, we could be authenticating more or less continuously in one form or another. I'll let my digital proxy (like my cell phone) do that for me, thank you very much.

9:50 PM | Comments () | Recommend This | Print This

An Open Source, For Profit Project

Andre Durand and Eric Nolan reminded me that PingID and SourceID represent exactly the kind of symbiotic relationship between an open source project and for-profit company I was mentioning. Note that SourceID uses a "public source" license. (Disclaimer: I'm a PingID advisor.)

6:02 PM | Comments () | Recommend This | Print This

Knowledge Management from the Inside Out

I had another opportunity to spend some time with Cogito. Cogito's current products represent a form of content management for engineering documents (everything from schematics to detail drawings). As we were talking, I began to see them as doing knowledge management and collaboration, but from the inside out.

When a traditional collaboration company (maybe too young a field to be calling anyone "traditional" but bear with me) like Groove approaches collaboration and knowledge management, they view the archive of team data as the artifact and build a meta model of the archive in an attempt to provide an understanding of the information to the team, allow them to make better use of it, and provide a collaboration vehicle. In many cases, that archive of information is a collection of representations of some mental concept that is collectively shared by the team.

As an example, consider a Boeing 777. There are millions of engineering drawings and other documents and hundreds of databases that collectively represent the archive of information about any particular 777 design. The design process consists of engineers and others building this archive of information to represent the mental map they collectively share of what a 777 is. Now, suppose that you could build a representation of that mental map from this archive in such a way that the entire archive could be thrown out because any piece of it can be regenerated at will. What's more, any changes to the artifacts in the archive update the model and consequently any other document associated with that artifact is automatically updated as well.

This represents an alternative approach to knowledge management that I think of as "inside-out." Instead of building a model of the archive on the outside, you build a model of the concepts---the things inside the artifacts that make up the archive.

This may all sound too good to be true, but Cogito has real contracts with real companies producing these models on the scale I've described here. Their revenues are modest to date, but they have a significant partner that should provide an excellent channel for selling their technology. I think that deep down, there's a connection between what Cogito is doing and what the semantic web is attempting, but two and one half hours wasn't enough time to ferret that out.

3:48 PM | Comments () | Recommend This | Print This

February 12, 2003

Business Intelligence

I've found that if you're looking for information about who's hiring, who's looking for funding, who's recently found it, and so on, you need look no further than an accountant. Both those working for large firms and those working for smaller businesses know a lot about the local business climate. They're a great source of intelligence.

10:25 PM | Comments () | Recommend This | Print This

Spicy Noodle Sub-Culture

Dave is making me hungry talking about spicy noodles. Those sound really good. I wish I knew of a place that sold them in Salt Lake. We could start a whole blogger sub-culture around spicy noodle eating. Malouf?

11:38 AM | Comments () | Recommend This | Print This

Ethics and Fiduciary Duties

I figured that my article yesterday on Linux and IP would generate a little controversy. I was right. Here is an example of the kinds of comments I received:

I disagree. The short version of why I disagree is that if a company insists on doing things that are legal but unethical (or even immoral), the company should not be surprised and cry foul when those laws are then changed and their actions are made illegal retroactively. They will also have generated a lot of ill-will along the way.

While I agree that there are many things that are legal but still unethical, there is no bright line between the two. Consequently, each corporate officer has to make their own decisions. I've been in that position and you make them all the time. It's not always easy. People who think it is typically have the luxury of drawing a paycheck for simply producing code or operating a system. There's not as much room for ethical controversy there (although there is some).

I don't agree that protecting intellectual property is, in and of itself, unethical. I also don't agree that using the legal or political system to gain advantages for your shareholders is, in and of itself, unethical. There are many actions in each of those areas and countless others that are unethical, but that doesn't taint the whole area.

I love open source projects and have been a beneficiary of them since I started working on the Internet in the 80's. I also believe that there is significant promise in open source business models. I applaud companies like jBOSS and Jabber for exploring business models that try to show that open source is a viable way of creating shareholder value. I do not believe, however, that "information wants to be free" or that open source is inherently good and other models inherently evil.

Here's what would be unethical in the case of SCO: If the corporate officers of SCO, without the knowledge or approval of the board, were to simply decide, on the basis of their personal beliefs or desires, to ignore SCO's significant IP claims and open source their significant code base, they would be ignoring their fiduciary duty to their shareholders. However, if they have a real business plan that incorporates an open source strategy and the board approves it, that's a different story.

I don't think that the case of open source is advanced by simply labeling any attempt to protect IP claims "unethical" or "evil." That's too easy and doesn't carry much weight. What does advance it is to show people with the fiduciary responsibility to create shareholder value how they can best do that using open source. I think the jury's still out on this one.

10:00 AM | Comments () | Recommend This | Print This

February 11, 2003

Corporate Sabotage

Today's Deseret News carries a story about corporate sabotage. Seems a company in American Fork fired a systems administrator and the guy took their systems down. The story is a little lame, but the business owner claims $20,000 per day in losses and says they've been down five days. This is a big problem for small businesses---maybe bigger in relative terms than it is for large businesses who have more resources.

As small businesses rely more and more on computers, they have very few resources that they can lean on to provide computer support. Companies like Direct Pointe can provide first class IT support for standardized things like file, print, and messaging, but that doesn't help with things that require more custom approaches.

7:01 PM | Comments () | Recommend This | Print This

Linux War: IP vs. Open Source

As noted in the InfoWorld article, SCO has hired David Boies to represent them. Many believe that this indicates that they are going to start aggressively enforcing their IP claims in the Unix space. I wouldn't be at all surprised. As I've noted before, venture firms have themes and one unmistakable theme that you'll find in the Canopy Group is a belief in intellectual property as a competitive advantage. SCO is a Canopy Group company.

A lot of people will be upset and blame SCO for "doing the wrong thing." I don't necessarily see it that way. While I'm not big on IP, and particularly lawsuits, as a way of creating competitive advantage, I can't blame anyone managing a company for doing anything that is legal and in the best interests of their shareholders. When I was a corporate officer, I took my obligations to the shareholders very seriously, and if I were running SCO, I might be making the same moves, regardless of my personal biases. Don't get mad at SCO; go out and prove that a different model yields more shareholder value. Then you've won.

4:43 PM | Comments () | Recommend This | Print This

February 10, 2003

Volution Tech

Volution Tech is a spin-off of Center 7 and SCO. One product, called "Volution Manager," provides remote management capabilities for servers. For example, McDonald's has SCO Unix servers sitting in thousands of stores around the globe with no local IT support. Consequently, in an effort to preserve reliability, they've been slow to migrate, upgrade, and install new applications. Volution Manager allows all of those remote servers to be managed.

Another product they are selling, called "Pilot Center," is aimed at the corporate data center and facilities management. Pilot Center is sold as a leased appliance that contains the hardware and software necessary to create the monitoring and management environment.

4:14 PM | Comments () | Recommend This | Print This

Power Innovations

Bob Mount is the CEO of a company called Power Innovations. I've known Bob for 10 years or so. His company provides power solutions for digital equipment in specialized environments such as airport baggage scanners, military vehicles, oil exploration vehicles, mobile satellite launch vehicles, industrial applications, and aviation.

Power Innovations has developed power supplies for resuscitation units for premature infants. These allow a unit to be removed from utility power for up to an hour so that the baby and the unit can be transported. They also give feedback on operations so that a hospital can prove that a particular unit was working.

2:38 PM | Comments () | Recommend This | Print This

MaxStream

MaxStream makes wireless networking gear, but its market is embedded devices rather than personal computers. They build wireless modems in the 900MHz and 2.4GHz bands for use in weather stations, electric and gas meters, remote condition monitoring in mobile and fixed applications, vending machines, point of sale devices, HVAC, gas lines, and so on.

Why not 802.11? I asked that question and the answer comes down to three things:

  1. Overhead - 802.11 implements an entire networking stack and lots of embedded devices don't need it.
  2. Range - There's an inverse relationship between range and bandwidth. Most embedded applications need range more than they need bandwidth.
  3. Cost - These devices in bulk need to be very cheap to be embedded in other devices.

On the other hand, why not X10 or something like it? It comes down to the advantage of digital signals over analog ones. MaxStream is digital and so provides packetization, retries, encryption, and so on. X10 doesn't do that.
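
To illustrate what packetization and retries buy you, here's a generic Python sketch; the framing and names are my own invention, not MaxStream's protocol:

```python
# Generic illustration (my own, not MaxStream's protocol) of what
# "packetization and retries" means: chunk the payload, attach a simple
# checksum, and retransmit any packet the receiver doesn't acknowledge.

def make_packets(data, size=4):
    """Split data into (seq, chunk, checksum) packets."""
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    return [(seq, c, sum(c) % 256) for seq, c in enumerate(chunks)]

def send_reliably(packets, transmit, max_retries=3):
    """Send each packet, retrying until acknowledged or retries run out."""
    delivered = []
    for pkt in packets:
        for attempt in range(max_retries):
            if transmit(pkt):          # True means the receiver ACKed
                delivered.append(pkt)
                break
        else:
            raise IOError("packet %d lost" % pkt[0])
    return delivered

# Simulate a lossy link that drops the first attempt at every packet:
attempts = {}
def lossy_link(pkt):
    attempts[pkt[0]] = attempts.get(pkt[0], 0) + 1
    return attempts[pkt[0]] > 1        # succeeds on the second try

ok = send_reliably(make_packets(bytes(range(10))), lossy_link)
assert len(ok) == 3                    # 10 bytes -> packets of 4, 4, 2
```

A one-shot analog signal has no sequence numbers, checksums, or acknowledgments, so there's nothing to retry.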

2:04 PM | Comments () | Recommend This | Print This

Helius

Helius started out years ago building satellite interfaces into routers. The idea was to use the satellite for content distribution. They claim that it's better, for certain applications, than either a standard satellite video system or terrestrial IP (i.e., the Internet). In the case of satellite video, they have the advantage of having full IP, so they get VoIP, interactivity, and so on. In the case of terrestrial IP, they eliminate all the routers that would sit between the source and sink, as well as getting multicast capabilities (which the Internet has failed to deploy).

Helius sells appliances. They have routers, video encoders and decoders (all IP based) and training systems. Their technology, however, is all about the software that lives on those boxes. As you can imagine, a lot of this involves digital rights management so that a company distributing training, for example, can determine who has paid for the training and authorize just those people to access the content.

Customers typically do training (corporate or otherwise) or distribute content to many locations (think of product advertisements inside retail establishments). They haven't graduated into the kind of aggressive personalization that was showcased in Minority Report, but it's certainly possible, since this is all IP based, to tie it into the store's customer database.

11:46 AM | Comments () | Recommend This | Print This

Veloxa

Veloxa is a subsidiary of eBiz Enterprises. Bruce Parsons is the President and CEO. Veloxa provides reconfigurable computing solutions based on FPGA technology. It's the second company in this space I've run into in as many weeks.

Veloxa is creating tools that compile C/C++ into FPGA cores. Veloxa provides application specific cores that are pre-developed as well as development tools for creating customer specific applications.

Veloxa is targeting seismic data processing in oil exploration, rendering in entertainment, defense and intelligence applications, and genomic applications in biotech.

Veloxa believes that their competitive advantage is in the application management infrastructure they've developed (based on JXTA) that allows them to manage clusters of FPGAs running multiple jobs, queuing data and cores, finding free nodes, and so on. Think of it as Tivoli for Linux clusters with application-specific FPGA-based co-processors. The management software takes requests that include data sources, data sinks, and processing needs and schedules the right core and data at an available node.
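
A toy sketch of that scheduling idea as I understand it; all names and fields here are my own invention, not Veloxa's software:

```python
# Toy sketch of the scheduling idea described above; all names are my own
# invention, not Veloxa's API. A request names a core (FPGA bitstream) and
# data; the scheduler picks a free node, preferring one that already has
# the right core loaded to avoid a reconfiguration.

def schedule(request, nodes):
    """Assign a job to a free node, preferring one with the core loaded.

    `request` is a dict with "core" and "data" keys; `nodes` is a list of
    dicts with "name", "core" (currently loaded), and "busy" flags.
    """
    free = [n for n in nodes if not n["busy"]]
    if not free:
        return None                       # no capacity: queue the job
    # prefer a node already configured with the requested core
    match = [n for n in free if n["core"] == request["core"]]
    chosen = match[0] if match else free[0]
    chosen["busy"] = True
    chosen["core"] = request["core"]      # (re)configure if needed
    return chosen["name"]

nodes = [
    {"name": "n1", "core": "seismic-fft", "busy": True},
    {"name": "n2", "core": "render",      "busy": False},
    {"name": "n3", "core": "seismic-fft", "busy": False},
]
# n3 wins: it's free and already has the seismic core loaded
assert schedule({"core": "seismic-fft", "data": "survey-17"}, nodes) == "n3"
```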

Veloxa has estimated that application-specific FPGAs could replace 30% of the current high performance computing market of $8.5 billion.

11:06 AM | Comments () | Recommend This | Print This

Cogito

Next up is Cogito, a company that is focused on knowledge management. I first met Dallas Noyes, the founder, in 1998 when I was CTO at iMALL. The technology and the business plan are, obviously, much more mature now.

Their clients have primarily been military and security based. In the first phase, they've been primarily working with end users. Boeing has been a big customer. Their strategy is to move their product into OEM vendors of software tools that those end users employ to do their work. This market is called "product lifecycle management," or PLM, but you may know it better as "computer aided design."

In the PLM application, the technology allows one to build a database of parts and information about how they're used (configurations) and then the software will generate schematic drawings. Think of it as content management for engineering drawings.

The underlying technology differs from something like Autonomy in that Autonomy and others build metamaps of existing archives to help find relationships in them, but at the end of the day, they are finding information. Cogito builds conceptual models of the underlying archives with the goal of replacing those archives by being able to regenerate their useful features at will from the conceptual model.

I have to admit that when I first talked to Dallas in 1998, I didn't really understand what he was talking about. Part of the problem was that I was very focused on developing an ASP-model eCommerce system and I couldn't see a fit, so I didn't try that hard. Since then I've expanded my views and interests in IT and this time the meeting made perfect sense.

9:47 AM | Comments () | Recommend This | Print This

iArchives

Today I'm spending the day at the Canopy Group's Banker's Summit. The Canopy Group is a private venture group backed by Ray Noorda, the driving force behind Novell in its heyday. The day is basically a back-to-back series of presentations by some of the Canopy companies. First up is iArchives, a company that makes analog documents (like paper and microfilm) fully searchable. Russ Wilding is the President of iArchives and an old friend. Our wives went to high school together, so I've known him since long before there was any high-tech connection.

iArchives is fundamentally a company built on character recognition technology. They have contracts with some major newspapers to digitize old papers and make them available on-line. The character recognition is used to create indexes for searching, not to deliver text to the user. The user sees the original image.

Interestingly enough, there is an eGovernment connection with what they do. One of their clients is a firearms manufacturer who wants to make their records available electronically to the ATF. The IT process that places like the ATF use to track firearms would scare you. This kind of technology could help.

9:11 AM | Comments () | Recommend This | Print This

February 8, 2003

Linux Networx

I had the opportunity to meet and speak with Steve Hill, the President of Linux Networx, yesterday. Linux Networx builds and sells Linux clusters. They have an impressive client list including the NSA, Los Alamos, and Lawrence Berkeley National Labs. When my students toured the Center 7 data center last year, all they could talk about was the Linux Networx cluster that was being staged and built there. I'm hoping to get out and visit with them soon and find out more about what they're doing.

4:55 PM | Comments () | Recommend This | Print This

February 6, 2003

Cultural Archetypes

I had the opportunity to meet with Paul Losee this afternoon. Paul is one of the founders of Iomega, along with Dave Bailey and Rod Linton. I've known Rod and Dave for some time and had looked forward to meeting Paul. Paul talked to me about "cultural archetypes," a method of finding the message about a product that will really resonate with buyers. It's the method that turned Iomega from a company that sold "removable, high-capacity disk drives" to a company that sold "a place to put your stuff," with a resulting 100-fold increase in their stock price. Quite an interesting story.

8:15 PM | Comments () | Recommend This | Print This

Too Many Patches

Bruce Schneier, well-known security expert and CTO of Counterpane Security, has a letter in the New York Times about the dilemma faced by CIOs who run large numbers of Microsoft machines: there are too many patches, and they can't be installed automatically because they often break things, and yet if you don't install them, you're vulnerable to worms like Slammer.

I was having lunch this week with the CIO of a company you've all heard of. He's responsible for thousands of machines and they've had a policy of selectively installing patches after testing them for compatibility and effectiveness (i.e. doing Microsoft's QA work for them). Slammer hit them hard, in a matter of minutes. Now he's rethinking that and wondering if it's not better to automatically install the patches and live with the clean-up problems that will inevitably result. This isn't some theoretical discussion. It's critical to the enterprise. If Slammer had hit at the end of a quarter, it would have had devastating consequences for sales at many companies. Yet the cost of patching is inordinately high. No good choices here.

8:46 AM | Comments () | Recommend This | Print This

February 5, 2003

Two More Entries in the Utah Blogroll

Add Brian Sweeting, who works for Novell, and Jeff (Brown|Holmes|Young), from my class last semester, to the Utah Blogroll.

9:14 PM | Comments () | Recommend This | Print This

TBL on the Semantic Web

I ran across a set of slides by Tim Berners-Lee on what's happening at the Laboratory for Computer Science on the semantic web project. Interesting stuff. It reminded me that I probably need to dig deeper into this. There are some interesting parallels between this and my thoughts on application layer internetworking. In particular, see the slide on the application integration hub. There is also an interesting research wave front slide that gives a good idea of where interesting open problems are likely to be.

3:13 PM | Comments () | Recommend This | Print This

Web Services Interoperability Organization

I've been writing about web service specifications for interoperability. It's probably a good idea to say something about the organization behind this, the WS-I. WS-I includes Microsoft and IBM, two long-time proponents of web services standards. Like most such bodies, the issues are more about politics than anything else. A recent ZDNet article stated:

...the WS-I has been better known for various political squabbles than for technical leadership. A high-profile spat between Sun Microsystems and its founding members has generated most of the attention for the group. After initially being shut out by founding companies including IBM, Microsoft and BEA Systems, Sun subsequently joined the organization.

The WS-I is focusing on security at the moment and that will be the focus of their meeting in Salt Lake City in March. If I could figure out a way to get invited, I could blog it. :-)

8:36 AM | Comments () | Recommend This | Print This

February 4, 2003

Another State CIO Weblog

I just discovered (via Technorati) that Rock Regan, a friend, CIO of the State of Connecticut, and past president of NASCIO, is writing a blog. It's on my news feed now. Go Rock!

6:06 PM | Comments () | Recommend This | Print This

Wi-Fi Security Feature

My picture was on the front page of the Daily Herald today. It's the second time in as many months I've been on the front page of a paper. It's a little disconcerting to pull up to the gas station and see yourself in the newspaper machines. The subject of the article was Wi-Fi security, and it used the picture shown here of me with a Pringles can antenna. In a sidebar to the piece (which doesn't appear in the electronic version that I can see) I offered the following bits of advice on Wi-Fi security:

  • Buy higher-end equipment. The cheapest wireless instruments provide only 40-bit encryption. 128-bit encryption at least takes a little longer to break.
  • For better protection, put the wireless devices in an untrusted portion of the network and use a virtual private network to access internal network resources. This provides a secure tunnel between the wireless device and the trusted portion of the network.
  • Stay out in front of your employees. Install wireless before the Jimmy in accounting does it for you---poorly.

Clearly, this just scratches the surface. I've written here before about Wi-Fi. I'm working on a more comprehensive white paper on the topic that I hope to have done soon.

9:54 AM | Comments () | Recommend This | Print This

February 3, 2003

GXA Components: Security Example

This morning I wrote about the GXA security specifications. I took some time this evening to read through the specification and thought an example might be helpful. This example is quoted from the specification:

(001) <?xml version="1.0" encoding="utf-8"?>
(002)    <S:Envelope 
                 xmlns:S="http://www.w3.org/2001/12/soap-envelope" 
                 xmlns:ds="http://www.w3.org/2000/09/xmldsig#"> 
(003)      <S:Header>
(004)       <wsse:Security 
                     xmlns:wsse="http://schemas.xmlsoap.org/ws/2002/xx/secext"> 
(005)         <wsse:UsernameToken wsu:Id="MyID"> 
(006)          <wsse:Username>Zoe</wsse:Username>
(007)          <wsse:Nonce>FKJh...</wsse:Nonce> 
(008)          <wsu:Created>2001-10-13T09:00:00Z</wsu:Created> 
(009)         </wsse:UsernameToken> 
(010)         <ds:Signature> 
(011)           <ds:SignedInfo> 
(012)              <ds:CanonicalizationMethod 
                             Algorithm= 
                               "http://www.w3.org/2001/10/xml-exc-c14n#"/> 
(013)             <ds:SignatureMethod 
                             Algorithm= 
                               "http://www.w3.org/2000/09/xmldsig#hmac-sha1"/> 
(014)             <ds:Reference URI="#MsgBody"> 
(015)             <ds:DigestMethod 
                            Algorithm= 
                              "http://www.w3.org/2000/09/xmldsig#sha1"/> 
(016)             <ds:DigestValue>LyLsF0Pi4wPU...</ds:DigestValue> 
(017)           </ds:Reference> 
(018)         </ds:SignedInfo> 
(019)         <ds:SignatureValue>DJbchm5gK...</ds:SignatureValue> 
(020)         <ds:KeyInfo> 
(021)           <wsse:SecurityTokenReference> 
(022)           <wsse:Reference URI="#MyID"/> 
(023)           </wsse:SecurityTokenReference> 
(024)         </ds:KeyInfo> 
(025)      </ds:Signature> 
(026)    </wsse:Security>
(027)  </S:Header> 
(028)  <S:Body wsu:Id="MsgBody"> 
(029)   <tru:StockSymbol 
                xmlns:tru="http://fabrikam123.com/payloads">QQQ
           </tru:StockSymbol>
(030)  </S:Body>
(031) </S:Envelope>

There are a few things to remember as you look at the specification:

  1. The SOAP envelope has been extended to accommodate the security portions.
  2. The security standard makes use of the XML Signature specification (the ds namespace).
  3. The signature has to reference other elements of the message (e.g. what part the signature applies to) and uses the Id attribute in the wsu namespace to do this.

Deconstructing this example is fairly straightforward. The SOAP envelope header contains a single element, <wsse:Security...>, which contains the UsernameToken and the digital signature information. The signature contains information about how the signature was computed, the reference to the message body (to indicate what portion the signature applies to), and the signature itself. Notice that the specification doesn't mandate a particular algorithm; it just allows one to be referenced so that both ends know what to do. If a better algorithm comes along next year, it can be used without any fuss. The final portion is the actual body of the message which, in this case, contains a stock symbol.

There's obviously much more to the spec than this simple example, but if you understand what's going on here, the rest is just options, alternatives, and details. Encryption would be similar, except it would reference the XML Encryption specification and some of the details would change. And, of course, the SOAP body would be gobbledygook.
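
Here's a minimal Python sketch of the two-step signing pattern in the example: digest the body, then sign the digest info with a keyed HMAC (the "hmac-sha1" SignatureMethod). Real WS-Security canonicalizes the XML first (the c14n step), which this sketch skips entirely; the key and element strings here are illustrative assumptions:

```python
# Minimal sketch of the signing pattern in the example above: digest the
# (already canonicalized) body, then HMAC-SHA1 the SignedInfo that embeds
# the digest. Real WS-Security does XML canonicalization first; this
# sketch skips that, and the key here is an illustrative stand-in.

import base64
import hashlib
import hmac

def digest_value(body: bytes) -> str:
    """Base64 SHA-1 digest of the body (the ds:DigestValue content)."""
    return base64.b64encode(hashlib.sha1(body).digest()).decode("ascii")

def signature_value(signed_info: bytes, key: bytes) -> str:
    """HMAC-SHA1 over the SignedInfo element (the ds:SignatureValue)."""
    mac = hmac.new(key, signed_info, hashlib.sha1)
    return base64.b64encode(mac.digest()).decode("ascii")

body = (b'<tru:StockSymbol xmlns:tru="http://fabrikam123.com/payloads">'
        b'QQQ</tru:StockSymbol>')
dv = digest_value(body)
# The digest goes inside SignedInfo, and SignedInfo is what gets signed:
signed_info = ('<ds:SignedInfo><ds:DigestValue>%s</ds:DigestValue>'
               '</ds:SignedInfo>' % dv).encode("ascii")
sv = signature_value(signed_info, key=b"secret-derived-from-the-token")

# Tampering with the body changes the digest, so verification would fail:
assert digest_value(body + b" ") != dv
```

The indirection through SignedInfo is why the receiver can check both that the body is untouched (digest) and that the header itself wasn't altered (signature).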

10:30 PM | Comments () | Recommend This | Print This

A Utah Fund of Funds

House Bill 240, currently being considered by the Utah Legislature, would set up a non-profit corporation authorized to raise up to $100 million and then disburse that money to existing or new venture capital firms with a Utah presence. Those VCs would be encouraged through various means to invest the money in either Utah firms or firms that create Utah jobs. How do you raise the $100 million in the first place?

With contingent tax credits. The non-profit, called a fund of funds, would guarantee its investors a certain rate of return (probably equal to the current money market rate). If the fund did not perform as promised over its life (usually more than 10 years), the investors could recoup their investment through tax credits. These kinds of bills have proven successful in places like Oklahoma and Arizona in creating new sources of venture money.

Industry statistics from Arizona show that their economy receives $6.54 in revenue over a five-year period for every $1.00 of venture capital investment. In addition, for every $1 million in venture capital, 27.6 jobs are created in the state. Utah's young companies have seen available venture capital disappear, falling from a high of $330 million invested in Utah firms in the second quarter of 2000 to a low of $4 million in the third quarter of 2002.

It's not just the total dollar amount available for investment in a geographic region that's important; it's also the number of venture capital firms. Each firm has its own flavor or theme. For example, in the last month I've talked to VCs who say "I won't invest in services," and then had lunch the next week with one who says "Services are where it's at. That's all I invest in."

If you are involved in high-tech in Utah, would like a high-tech job in Utah, or simply care about it, I urge you to write to your representative and your senator1 and explain to them why Utah needs more investment in high-tech and ask them to support HB240. Make sure you tell them that you are a constituent in their district or that you own a business in their district. That carries a lot of weight. Written or faxed constituent communication is taken as being much more important than email.

  1. The Utah Senate doesn't have an interactive map to find your district. You may have to visit your county web site (you can get to it from utah.gov) or call the Senate and ask them (801-538-1035). I could explain to you why the House has a nice map and the Senate doesn't and how they could both have really great ones for free, but I'm trying really hard not to piss anyone off this week.

8:33 PM | Comments () | Recommend This | Print This

Lucene

Unfortunately, I don't get to program much anymore, but I find that when you're managing a bunch of programmers, it's frequently nice to be able to call their bluff. Consequently, I try to read magazines like the Java Developer's Journal to keep up with things and it usually pays off. I find something almost every issue that I'm glad to know about. In the December 2002 issue, for example, I found out about Lucene.

Lucene is an open source text indexing and search tool written in Java. It's not unusual anymore to want search capabilities in an application, and Lucene provides one way to do that. The web site claims that it is "high-performance" and offers a page for users to submit benchmarking results for review. No idea, of course, how these compare to other search products, from Verity for example. If you've used Lucene in a project and would like to comment on its reliability, ease of use, or performance, please feel free to drop me a note.
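If you haven't run into text indexing before, the core idea behind a tool like Lucene is the inverted index: map each term to the set of documents containing it, then answer queries by intersecting those sets. Here's a conceptual sketch in Python (this is a toy, not Lucene's actual API---Lucene adds analyzers, ranking, incremental updates, and on-disk storage on top of this idea):

```python
from collections import defaultdict

# A toy inverted index---the core idea behind a text search library
# like Lucene. Conceptual sketch only, not Lucene's API.
class TinyIndex:
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of doc ids
        self.docs = {}

    def add(self, doc_id, text):
        """Index a document: record which terms appear in it."""
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        """AND search: return ids of docs containing every query term."""
        terms = query.lower().split()
        if not terms:
            return set()
        result = self.postings[terms[0]].copy()
        for term in terms[1:]:
            result &= self.postings[term]
        return result

idx = TinyIndex()
idx.add(1, "open source text indexing and search")
idx.add(2, "Java Developer's Journal December issue")
idx.add(3, "text search written in Java")
print(idx.search("text search"))   # docs 1 and 3
```

The intersection trick is why indexed search scales: queries touch the postings lists, never the full text of every document.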

5:32 PM | Comments () | Recommend This | Print This

Flying Without ID

Jeremy Zawodny posts a link about Flying Without ID. Now, I fly without ID all the time...in my plane. Some people are surprised when they learn that you're pretty much free to fly wherever you want whenever you want. The only exception is controlled and restricted airspace, but that's not usually a problem. The link Jeremy posts is about flying without ID on a commercial airline. I think this is interesting, but not really something that gets me all fired up. After all, of all the restrictions that Delta puts on me in order to transport me from point A to point B, showing them my ID is one of the least onerous. I'd rather not have to show them my credit card, for example.

4:54 PM | Comments () | Recommend This | Print This

GXA Components: Security

I've been writing about the Global XML Web Services Architecture, a set of specifications that sit on top of SOAP and provide interoperability in a number of important areas. This article is a look at WS-Security, the GXA security specification. These articles and associated resources are being indexed in my featured papers outline.

The OASIS web services security specification creates a set of extensions to SOAP messages that can be used to secure messages and ensure their integrity. Note that this is message-level security, not secured channels (which you could do with SOAP using HTTPS as the transport, for example). Message-level security is important whenever intermediaries are part of a conversation. These extensions are referred to as the "web services security core language" or WSS-Core. The specification was put together by Microsoft, IBM, Verisign, and others.

IBM and Microsoft have released a white paper which describes the specification. The specification extends SOAP so that the following processes can take place (quoting from the MS/IBM white paper):

  • A Web service can require that an incoming message prove a set of claims (e.g., name, key, permission, capability, etc.). If a message arrives without having the required claims, the service may ignore or reject the message. We refer to the set of required claims and related information as policy.
  • A requester can send messages with proof of the required claims by associating security tokens with the messages. Thus, messages both demand a specific action and prove that their sender has the claim to demand the action.
  • When a requester does not have the required claims, the requester or someone on its behalf can try to obtain the necessary claims by contacting other Web services. These other Web services, which we refer to as security token services, may in turn require their own set of claims. Security token services broker trust between different trust domains by issuing security tokens.
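The policy idea in the first bullet boils down to a simple check: the service's required claims must all be proved by the message's security token. A minimal sketch (my own illustration, not the WS-Security wire format; the claim names are made up):

```python
# Sketch of the policy check described above (not WS-Security itself):
# a service requires a set of claims; a message's security token must
# prove at least those claims or the message is rejected.
def accepts(policy: set, token_claims: set) -> bool:
    """All required claims must appear among the token's claims."""
    return policy <= token_claims

policy = {"name", "key", "permission:getQuote"}
token = {"name", "key", "permission:getQuote", "capability:trade"}

assert accepts(policy, token)                # every required claim proved
assert not accepts(policy, {"name", "key"})  # missing claim -> reject
```

The third bullet's security token services fit the same picture: a requester whose token fails the check goes off to another service to obtain the claims it lacks.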

The paper gives a number of scenarios showing the different ways that the specification can be applied. I think it's particularly important that the specification is built to accommodate existing security models rather than foisting another one upon the enterprise. This was the big mistake in 802.11a/b: the security wasn't built on existing, socially proven concepts, but on something completely new that, as it turns out, didn't work.

The WS-Security specification is not the end, but the foundation for a number of other security-related specifications that will build on its message-level security in the areas of security policy, trust, secure conversations, privacy, authorization, federation, and others. White papers in the areas of trust, policy, and secure conversations were released in December 2002.

11:39 AM | Comments () | Recommend This | Print This

February 1, 2003

PsyncX for Backing Up OS X: A Review

Yesterday, I published the brief results of my short survey of OS X backup options. I said that I'd be hesitant to try the open source PsyncX without additional information. The problem isn't that I don't like trying untested software, I just don't like trying untested software that mucks around with my file system when that same system is not backed up! Well, Ted Hughes wrote to me and told me of his experience with PsyncX, so I decided to try it. Here's what I found.

I chose to back up my user directory to an SMB-mounted disk from another machine.

  • PsyncX is a fairly simple program that mirrors a source to a target.
  • You can't select multiple disks. Since my email, my documents, and my Radio files are all in separate places, I could only backup one at a time.
  • The good thing about a mirror is that it is immediately accessible and it's easy to see if what you wanted is there.
  • The downside is that it's unencrypted, uncompressed, and you don't get incrementals. That's not to say that PsyncX re-copies unchanged files---it doesn't---but that you can't revert back to the way a file was 3 days ago. The changes just get written out and overwrite the old file.
  • PsyncX is really just a GUI on a script that gets called from cron. Once it's set, you can't edit the cron job from within PsyncX. I haven't tested whether you can create multiple PsyncX cron jobs using the GUI (to back up different targets). You could certainly create them by hand.

So, it works and is simple to use. It's not sophisticated by any means, and I'd like to see more control over targets and destinations, along with the ability to edit scheduled jobs. Still, it's free and my data is now a lot safer than it was this morning.
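For the curious, the kind of mirroring PsyncX automates fits in a few lines. This is an illustrative sketch only---PsyncX wraps its own script, not this code---but it shows the essential behavior described above: unchanged files are skipped, changed files are simply overwritten, and there are no incrementals:

```python
import os
import shutil

# Bare-bones mirror sketch (illustrative only---not PsyncX's script).
def mirror(source, target):
    """Copy files from source to target, skipping unchanged ones."""
    for dirpath, _dirnames, filenames in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        dest_dir = os.path.join(target, rel)
        os.makedirs(dest_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dest_dir, name)
            # Same size and mtime? Assume unchanged and skip.
            # Otherwise overwrite---the old version is simply lost.
            if os.path.exists(dst):
                s, d = os.stat(src), os.stat(dst)
                if s.st_size == d.st_size and int(s.st_mtime) == int(d.st_mtime):
                    continue
            shutil.copy2(src, dst)   # copy2 preserves the mtime
```

Because `copy2` preserves timestamps, a second run over an unchanged tree copies nothing, which is what makes a nightly cron job cheap.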

8:04 PM | Comments () | Recommend This | Print This