Once again into the breach, dear friends. I’ve made it a habit over the last several years to put together a list of both my own forecasts for the upcoming year in the realm of XML technologies and technology in general and, with some misgivings, to see how far off the mark I was from the LAST time that I made such a list. The exercise is useful, and should be a part of any analyst’s toolkit, because it forces you to look both at the trends and at potential disruptors to trends (which I’ve come to realize are also trends, albeit considerably more difficult to spot).
So, without further ado, I bring up the list from last year …
How Close I Came:
This was something of a no-brainer, though I have to admit that things haven't turned out quite as expected. Microsoft had to rebrand XAML/WPF as Silverlight, and by all indications, even with the brand change, it's not taken off anywhere near as well as expected. The Adobe Mars framework is similarly taking off in fits and starts, as people begin to realize that complex mixed XML and scripting interfaces are a lot harder than anyone anticipated. In many ways, the real surprise story here has been the rising strength of HTML as a framework, and the emergence of HTML 5 (including Tim Berners-Lee's announcement in the spring that HTML was essentially being reopened). By all indications, what I see coming out of this is that people are generally not jumping wholesale into XML frameworks, but are instead making HTML and XHTML work better for them via AJAX. I'm going to keep the binding story in the books for 2008, but I'm beginning to think that XBL has missed the boat, and that those bindings are increasingly being assigned via pure AJAX calls, rather than indirectly via CSS against XML bindings.
XForms Adoption. I've been working heavily with the XForms extension from Mozilla, and have also been in communication with XForms vendors throughout the industry - my original predictions for XForms adoption in late 2007 seem to be holding true. Again, here is a compelling technology that has a lot of uses, makes for a powerful programming model, and works incredibly well with both AJAX-type data feeds and XQuery-based services. I hope to have a report on a case study that I've had a direct hand in producing later this month that covers exactly what can be accomplished with XForms technology, but for now suffice it to say that I think XForms will likely start appearing on IT managers' radars towards the latter half of this year.
How Close I Came:
I’ll claim this one. XML 2007 incorporated an XForms evening, and XForms citations in the media have increased dramatically, especially in the last few months. The Mozilla XForms plugin is actually reaching a fairly useful state, Orbeon and FormsPlayer continues making headway (along with the surprise story for 2007 coming from Picoforms), and the combination of XForms and eXist is becoming one of the hottest combo stories (more on that in my 2008 predictions).
XQuery. 2007 will be the year for XQuery. I've written two books on the subject, and overall I'd begun to despair at the LONG timeline, but if the W3C stays true to form, the XQuery 1.0 specification will be ratified in February 2007. I've been working with XQuery and the open source eXist database, and have discovered something that I hadn't really considered before ... as a query language, XQuery is reasonably useful - but with a few fairly minor extensions XQuery can be used as the foundation for a very powerful server language in its own right, especially given that invoking an XSLT transformation takes a single line of code and can be processed inline. (This is REALLY useful with Saxon 8 as the XSLT processor, because it marries XSLT 2.0 technology with XQuery 1.0 technology, both of which use XPath 2.0.)
How Close I Came:
I may have been a little premature on this one, but not by much. XQuery is becoming big news - you're seeing it popping up in all kinds of places, from BEA's SOA offerings to Oracle 10g, IBM DB2 and SQL Server. A significant bulk of my programming work this year has been in developing XQuery-based XML data systems, and I fully anticipate that this trend will only accelerate. If you are a heavy XML developer, you absolutely must add XQuery to your repertoire, because I see it becoming an expected part of a programmer's incoming skill set.
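As an example of the kind of leverage I mean, here is a minimal sketch of the "XSLT in a single line" pattern from within an XQuery, using eXist's transform extension module. The module namespace and the document paths are assumptions for illustration, not a recipe to copy verbatim:

```xquery
xquery version "1.0";
(: eXist exposes inline XSLT through an extension module; the URI below
   follows eXist's convention and should be checked against your version :)
declare namespace transform = "http://exist-db.org/xquery/transform";

let $invoice := doc("/db/invoices/invoice-001.xml")
let $stylesheet := doc("/db/xslt/invoice-to-html.xsl")
(: the single-line transformation call, returning the rendered result :)
return transform:transform($invoice, $stylesheet, ())
```

The query engine and the transformer share the same in-memory tree, which is what makes this both terse and fast.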
3D Goes Local. Consumer-level 3D technologies wax and wane with surprising regularity, but I think there is more than a little evidence to suggest that we're on the waxing side of the curve again. Vista's 3D architecture, Linux's Xgl, the Java Looking Glass technologies, and so forth all bring 3D technology to the desktop both at the kernel level and fairly high in the application stack. Online games such as The Sims 2 and Second Life have brought millions of people into the 3D development fold (sometimes at the skins level, sometimes at the mesh), and there is now increasing interest in the X3D standard, based upon the older VRML specification, as both desktop and laptop hardware now contain the necessary horsepower to display 3D architectures in realtime. Similarly, applications such as Poser and Bryce have raised high-end 3D rendering into the art realm.
How Close I Came:
World of Warcraft, the Sims, Second Life ... yup. One of the most fascinating things I've seen of late is the marriage of Sims animations and YouTube videos, to the point where user-generated animated 3D movies are becoming surprisingly mainstream. X3D continues to languish, though I've been watching with some interest the development of a few 3D canvas extensions for Firefox that make me think that, again, it is the HTML browser space where things are likely to heat up. Another surprise (though it shouldn't have been), and something that I'm beginning to think will become the norm, is that OpenGL has become the de facto 3D standard, rather than an XML format like X3D. I'm scaling back my predictions on X3D development here; it's just not appearing on my radar often enough for me to feel a lot of confidence in its continued success.
The Fallout from Vista. Vista is large, ambitious, and has tied up the resources of Microsoft in one last great orgy of software production. The question now is whether or not the effort was worth it. I’m predicting that it won’t have been.
How Close I Came:
Yikes. Apparently I wasn't the only one who had trepidations. It's a bad sign when your customers are buying (or bootlegging) copies of your previous system to put on machines with Vista pre-installed. Vista has not been the great success that Microsoft had been hoping for - with problems ranging from digital rights management to incompatibilities in driver support to inferior performance (especially with Aero installed) - and there have been more than a few customer defections (primarily to Apple, it seems) as people have become fed up with Microsoft's hype being so much bigger than its products. I think that this year will see some amelioration of that - a new service pack, considerable loosening of security restrictions, some retooling and rethinking on a number of critical pieces of technology within Vista - all will likely strengthen the platform considerably. The question in my mind is whether Microsoft has reacted quickly enough here, or whether the exodus that's begun will only continue.
Year of Enterprise AJAX. Several pieces are now aligning to ignite AJAX in the enterprise. The toolsets are moving beyond "Hey, I just published a set of quick and dirty tools on my site" to fairly rich, comprehensive and reasonably standardized sets of components; formal methodologies for working with AJAX are being defined, making it easier for project managers and business analysts to estimate the viability of these technologies in their development efforts; and independent software development houses and consultants are now coming online with real success stories. This is already producing a significant shift in the way that applications are developed, raising the "client architect" to a position roughly comparable to that of the senior DBA and shifting the focus of applications away from the delivery of single pages of content to the delivery of multiple streams of partially processed or raw data. I expect that 2007 will see a fairly radical shift as companies re-evaluate their current delivery mechanisms and server-side solutions in favor of more RIA-oriented ones.
How Close I Came:
I’ll claim this one as well, though there’s been some interesting developments in this space this year - at heart you have the increasing impedence mismatch between SOAP/WSDL based SOA on one hand and the lighter-weight AJAX frameworks on the other. I’m increasingly getting the sense that one of the key drivers in all of this over the course of the next year will be XQuery - both because as a language XQuery is more effective than SQL at providing the relevant glue between the two worlds. Ruby on Rails may prove to be the other avenue, but my sense is that Ruby is beginning to run into serious headwinds of its own - I’m seeing more stories of Ruby-based projects that are just not up to the complexity of enterprise apps, less an indictment of the language (which I like) but more because its moving beyond its core developer base into the mainstream where developers are not as familiar with its design methodologies.
Going Atomic: REST Comes of Age. I see as an adjunct to this an increasing hybridization between SOAP-oriented web services and more REST-like ones, as the strengths and weaknesses of each approach have become known. My long-term forecast for SOAP is that it is likely an interim technology - useful in situations where you want to treat the server as a clearly defined component within a larger-scale integrated network, but increasingly being pushed into the background in favor of plain-old-XML (POX) and simple publishing APIs. Atom and the Atom Publishing Protocol (APP) are likely to factor big in this - syndication-oriented feeds have an enormous degree of utility, and I suspect that in the long term the APP will spell the difference in success between Atom and RSS 2.0. APP is about as REST-oriented as you can get, with a simplified and standardized API for publishing in a clearly defined manner, and with the realization that such publishing does not just have to be for HTML news feeds. Expect new formats in this arena (such as my friend and colleague M. David Peterson's LLUP protocol) and the Google Calendar API (which is Atom-based) to become de facto standards.
How Close I Came:
I think the trend is clear here, though whether it's become fully mainstream yet is still debatable. RESTful Web Services, by Leonard Richardson and Sam Ruby, has become one of the hottest books in tech land this year, and represented a significant clarion call to return to the roots of REST-based programming on the Internet. Tim Bray released an AtomPub Apache module, making it possible to support Atom-based publishing through the Apache system, and Atom is very quickly finding its way into a new Syndication Oriented Architecture (SynOA) as the default messaging language. More on this in the 2008 predictions.
Media Gets Tubed. If I were a junior-level exec at Warner Brothers, or Paramount, or Disney, or any other major studio, I'd likely not be sleeping very much at night anymore. The deconstruction of the industry is underway, and it's about to go into overdrive - and that cushy seven-figure income I've been making is likely to go the way of the dodo bird. Put a video camera into somebody's hands, and they quickly become a director - whether documenting the injustices of oil company dealings with indigenous peoples (as one Brazilian tribe recently did), making the next version of Star Trek (which a group of Trekkies did recently, creating a very credible and fairly engaging story within that universe) or creating their own "series" that often end up with viewerships comparable to those from television. Provide a forum like YouTube or MySpace to host this content, without the layers of censoring and competitive pressures, and all of a sudden the whole foundation trembles.
How Close I Came:
Two words: writers' strike. This has been a BAD year for Big Media. YouTube became a big target for just about every major studio, and YouTube was forced to retrench with regard to some of its policies, but overall the efforts on the part of NBC, Fox and others to launch their own YouTube-like services have been less than successful (actually, the words "abject failure" come to mind). The writers' strike represented perhaps the final stake in the heart, as it has effectively destroyed both the Fall and Spring viewing seasons and has also shifted an increasing number of people to the various digital video services. In the end, any settlement that the writers make may end up proving Pyrrhic - the studios are fast losing control of the production process, and the result is likely to be the rise of a whole new set of smaller, distributed companies that bypass the traditional distribution channels altogether.
Demise of Microformats. Summary: In 2007, microformats would start to fade.
How Close I Came:
Okay, this one I blew. Microformats, while still not exactly mainstream news, have been picked up as a quick and dirty way to add semantics to page markup, and are increasingly being targeted in additional applications such as Google GData feeds; moreover, there are some significant changes taking place as microformats meet the Semantic Web (see 2008 predictions).
Where the Tech Players Are. Summary: In 2007, Google up, Yahoo and Microsoft down, Mozilla holding its own but barely - look for a shakeup in the social network space, Opera ascendant.
How Close I Came:
Pretty much on the mark. Google dominated the news this year, though as with pro sports players, that doesn't necessarily bode well for next year. Microsoft did well in the Xbox market (as predicted), but Vista was definitely off, and the recent setbacks in everything from the EU's lawsuits to the failure of OOXML to become an ISO standard to Opera's recent complaint concerning standards compliance have definitely kept their legal and standards departments hopping. Yahoo's becoming increasingly irrelevant, and is supposedly quietly seeking suitors from a considerably reduced position, but with the seizing up of the credit markets over the summer, they may be in trouble. AOL is clearly in a tailspin now - the unit is still profitable, but it's faced significant layoffs and has a considerably diminished market share even compared to this time last year.
MySpace has gone from being the darling of the media to an also-ran, while Facebook has both risen and begun to show signs of falling within just the last year. I think that the social networks space in general is primed for a shakeout (more shortly).
Cloudbursts and Hurricanes: I predicted another bad hurricane season, and that didn't come to pass; however, enough nasty weather hit both the US and Canada that there's now very little doubt that global warming is real. Still, I should not be thinking about a career as a meteorologist. The summer passed with surprisingly mild weather in the Gulf of Mexico, though that's been one of the few quiet spots. Much of the US underwent a drought in 2007, while the West Coast was pounded by storms through much of the year. Globally it was yet again one of the hottest years on record, and the US has been effectively isolated for its stance with respect to global warming.
All in all, not a bad record, though it shows that I should probably play it safe a little closer to home. It does give me a chance to recall which trends I saw as important in late 2006, which still have legs, and which have played themselves out.
Now comes the hard part - getting up on the high wire again to see if, once again, I can guess the future while blindfolded. I’ve been a little closer to the trenches the last year, which has been good from the standpoint of letting me reinvigorate my knowledge base, but at the cost of perhaps missing some of the more critical trends. Thus, as per usual, your mileage may vary.
What I See in 2008
My predictions for 2008 are somewhat darker than those for 2007, mostly because I see some very worrying trends coming from a lot of different directions, and I see all of these things making the field considerably more confusing and volatile than it was in 2007.
Vertical XML Standards become pervasive. A number of large industry consortia, such as the XBRL and HL7 consortia, have generally reached a stage where the technology is adequate to support the standards, public acceptance of the standards has reached a solid comfort point, test and pilot projects have completed successfully, and so on. This means that these standards are now being deployed in limited commercial trials or, in some cases, in voluntary programs (such as the recent announcement that the Securities and Exchange Commission will be accepting XBRL as a suitable format for making business filings), with the possibility that such submissions may become mandatory within a handful of years.
Chances are that if you work for a company of a certain size, some part of your organization will have to figure out how to work with this technology, and if you are the XML go-to guy in your organization, that someone will likely be you - even if up to now your mandates did not specifically address those concerns.
My personal feeling is that beyond increasing transparency within organizations and making it possible to build more sophisticated applications rapidly and securely, such standards will likely make XML increasingly visible in portions of your organization where it hadn't even appeared before, and as such I suspect they will hasten the adoption of XML within the enterprise sphere for other ventures (such as documentation publishing, forms management, accounts payable and receivable, and even areas like advertising and marketing).
Concerning XBRL (the eXtensible Business Reporting Language) in particular: if you aren't familiar with it, as either an XML dev in your company or an independent developer, it's an area where you should concentrate some time for study. Not only is it shifting the way that we keep track of how businesses operate, it's also having a significant impact on new XML fields, such as the rise of programmatic ontologists.
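For those who haven't seen it, an XBRL instance is, at heart, just XML facts tied to contexts and units. The fragment below is a deliberately bare-bones illustration - the "acme" taxonomy namespace and the values are invented, while the xbrli structure follows the 2003 instance schema:

```xml
<xbrli:xbrl xmlns:xbrli="http://www.xbrl.org/2003/instance"
            xmlns:iso4217="http://www.xbrl.org/2003/iso4217"
            xmlns:acme="http://example.com/taxonomy">
  <!-- the reporting context: who is reporting, and for what period -->
  <xbrli:context id="FY2007">
    <xbrli:entity>
      <xbrli:identifier scheme="http://www.sec.gov/CIK">0000123456</xbrli:identifier>
    </xbrli:entity>
    <xbrli:period>
      <xbrli:startDate>2007-01-01</xbrli:startDate>
      <xbrli:endDate>2007-12-31</xbrli:endDate>
    </xbrli:period>
  </xbrli:context>
  <!-- the unit of measure for monetary facts -->
  <xbrli:unit id="usd">
    <xbrli:measure>iso4217:USD</xbrli:measure>
  </xbrli:unit>
  <!-- a single reported fact, tied to the context and unit above -->
  <acme:Revenues contextRef="FY2007" unitRef="usd" decimals="0">1500000</acme:Revenues>
</xbrli:xbrl>
```

The real intellectual work is in the taxonomies that define elements like Revenues, which is exactly why the ontologist roles discussed below are emerging.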
Companies/Projects to Watch: There are a lot of smaller companies in this space that are well positioned to take advantage of a significant rise in XBRL adoption. My two favorites at this point are both Cascadia companies - UBMatrix, out of Kirkland, Washington, and JustSystems, a Japanese company with a significant Vancouver subsidiary. UBMatrix's founders include Charles Hoffman, who was instrumental in the development of the XBRL specification, and their focus has long been on getting this particular specification commercialized.
JustSystems is something of an odd company that I first encountered a couple of years ago at the XML 2005 conference. With a significant presence in Japan, they’ve developed a number of products that are built around the W3C specifications to a degree that’s pretty startling compared to many western companies, but they’ve remained largely a Japanese phenomenon until their fairly recent acquisition of XMetal. Diane Mueller, a vice president with JustSystems, was also the chair of the most recent (2007) XBRL conference in Vancouver and chair of the Rendering Working Group, and has been a long-time proponent of XBRL.
Overall, I think both of these companies should be watched carefully, as they seem to have both the XML expertise to build useful tools and the XBRL conceptual chops to build more sophisticated XBRL-sensitive applications, and given the rise I see in XBRL usage and adoption, I see either of them as a potential market leader or buyout candidate at the right time. Similarly, for information about XBRL in general you can keep abreast of news and standards at the XBRL Consortium website.
The New XML Professional. Until fairly recently, if you were involved in XML at all within your organization, it was likely that you were either a content management specialist or you were a web developer (which is a web-dedicated content management specialist). Outside of that, if you consciously worked with XML at all, it was likely in a very peripheral capacity (building configuration files in Ant, for instance). That is beginning to change dramatically. There are a number of new job titles that are emerging in the programming field that I see becoming commonplace in the year ahead:
- Information Architects/Ontologists. An ontologist is essentially responsible for creating data models within XML, designing schemas and taxonomies, establishing standards and in general defining the business instruments to be used within an organization or business.
- XML Database Administrator. This particular position is the XML analog to a relational database administrator (or DBA), and is the one responsible for working with and maintaining XML database systems, for setting up query interfaces between relational and XML systems, and for managing the relevant accounts and access systems. The primary differences between an XML DBA and a SQL DBA come about because of the different schematic focuses and skill sets, though the job they do is similar.
- XML Database Analyst/XQuery Specialist. This particular position is responsible for setting up the more complex XQueries in a given data store, whether XML-based or not. Especially given the rise of web-service-facing XQuery (such as that used by applications like eXist, MarkLogic and so forth), this role is more akin to a web developer than a DBA, though it's likely that the analyst will work closely with both ontologists and XML DBAs.
- XML User Interface Specialists. This position is occupied by client-oriented developers working within the context of XML frameworks such as XForms, Silverlight, Mars/Flex, XUL, OpenLaszlo and so forth, as well as working with AJAX based systems. My suspicion is that as XQuery becomes more heavily established on the server, so too will XForms become more heavily entrenched on the client (the two are remarkably complementary) and that XForms developers in particular should be in particularly high demand, especially towards the middle to end of 2008.
- XML Transformational Specialist. These specialists are the ones proficient in XSLT or in translational systems, and their role increasingly will be the one responsible for moving various XML data streams into the appropriate format at the appropriate time. This may also be a shared responsibility with the XQuery analyst.
- Classification Taxonomist. This is the rather less glamorous cousin of the ontological architect - someone who goes through written content and annotates it with semantic web markup for post-processing. This is likely to be one of the roles that junior librarians in organizations are given before moving up the ladder to being full information architects. Additionally, these people will likely be tasked to write SPARQL or XQuery queries, though with less focus on low-level publishing specifics and more on higher-level, semantically significant content.
Sea Change to XML Data Systems. The publication of the XQuery standard in 2007 has started what should be seen as a generational shift from relational to semi-structured data systems, a process which will likely continue as the XQuery Update Facility is also finalized and ratified. One of the things to keep in mind about standards such as XBRL, HL7 and so forth is that while it is possible to represent them using relational database systems, in practice the data models that get generated for organizations become a study in combinatorics, with the complexity rising geometrically with the number of levels of hierarchy. What this means is that most databases will end up either being designed from the base up as XML databases, or having a mediation layer that hides the relational aspect of the data behind some form of abstracted XML representation.
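A quick illustration of that combinatorics problem, with an invented document: each additional level of nesting below would map to another table (plus join keys) in a normalized relational design, while the XML form simply nests.

```xml
<!-- three levels of hierarchy in one document; a normalized relational
     mapping would need roughly one table per element type, plus keys -->
<invoice id="2008-0001">
  <customer>Acme Widgets</customer>
  <lineItems>
    <item sku="W-100" qty="3">
      <discounts>
        <discount type="volume" rate="0.05"/>
      </discounts>
    </item>
  </lineItems>
</invoice>
```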
Within the last couple of months of 2007, MySQL released support for an XML layer that supports XPath for both querying and updating content, and I suspect that the longer-term goal is making MySQL XQuery compliant. With that, MySQL (which is far and away the most widely utilized open source database out there) joins most of the other major players in the XML space. This doesn't mean that SQL will disappear any time soon, mind you, but it does mean that we're moving into a model where XML is being positioned as the primary messaging layer between data producers and data consumers on the web. Given the role of XML in SOA systems and within syndicated systems, this change also represents the maturing of the XML messaging architecture on the web.
Companies/Projects to Watch: Hands down, my favorite OSS project in this space is eXist, to the extent that I use it heavily in most of my own consulting projects. With an imminent 1.2 release (effectively their third major release, as eXist has been around since 2000), eXist offers REST, WebDAV, XML-RPC, SOAP and Atom-based protocols for communication with the server, has a much improved indexing model, works effectively with Java, PHP, C++, Ruby and other platforms, and has decent performance metrics at the low-to-mid range; I see it as a powerful workhorse for GIS, content management, library systems and data publishing.
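The REST interface in particular is what makes eXist so pleasant to wire into web clients: a collection is just a URL, and a query is just a request parameter. As a sketch (the host and collection are hypothetical, the XPath would be URL-encoded in practice, and the /exist/rest path and _query parameter follow eXist's conventions as I understand them):

```
GET /exist/rest/db/invoices?_query=//invoice[total > 1000] HTTP/1.1
Host: localhost:8080

(the response is an XML fragment wrapping the matching invoice elements)
```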
My only complaint with eXist is that the things that make it so successful in that space don't necessarily scale up well to high-end systems. However, I've generally been quite taken by the capabilities that MarkLogic has shown at the upper end, as both its client roster and external metrics attest. As a commercial product the MarkLogic server is a little pricey, but its performance is generally worth the price.
XForms Gets WICkeD. I'm going out on a limb on this one - I suspect that it's more likely we'll see this in 2009 rather than 2008, but I think that in order for XForms to really hit its full potential, it will need to integrate the XPath 2.0 specification. However, with the recent publication of the Compound Document Format (CDF) and the corresponding Web Integration Compound Document (or WICD, pronounced "wicked"!), which recognizes both XForms and SVG, the push to beef up XForms will likely start in earnest. XForms 1.1 by itself is significant in that it fixes a number of key problems in the 1.0 specification, but realistically, by making it clear that XForms and SVG are both considered integrated components of a larger W3C technology, the W3C effectively lays formal claim to its vision of a comprehensive XML framework to compete against those of the various vendors.
Realistically, I think that throughout 2008, WICD will start gaining mindshare among developers as a unified concept. Compound documents represent a new class of thinking, one in which linkages and namespaces play a significant role and where documents are as likely as not to be distributed across multiple servers in various states. It's most likely that the first WICD systems will be implemented on AJAX-based foundations, but what gets supported will have more of the characteristics of what we think of as a document than of an application per se.
Companies/Projects to Watch: For WICD, I think that the company to keep an eye on here is Opera. While the Opera browser has always been a tight, well designed web client, in 2007 Opera shifted into overdrive with support for SVG that’s probably the best of any browser out there, superb CSS capabilities, adequate (if not stellar) XML support, and a surprisingly nuanced appreciation for standards. My biggest gripe with Opera is that they do not yet have a decent XForms client (I’d recommend them highly in that space if their XForms efforts came even close to their SVG efforts), but otherwise they are far and away the most “compliant” WICD browser currently operating.
On the XForms side, I've had the privilege of working with the developers of a number of XForms projects and products over the last two years as the webmaster for XForms.org (which will be undergoing a significant facelift shortly). This has also made it somewhat difficult for me to "pick favorites", as I like a lot of what I see. Overall, though, I think that there are four efforts in particular that should be watched closely.
The Mozilla XForms project is the one I've been working most closely with, so it's perhaps no real surprise that I would pick it as a project to watch. It is quite functional at the moment, including support for some of the XForms 1.1 features that are currently being proposed in working draft; I like the ability to customize forms components via XBL, and its CSS integration is generally quite good. The biggest downside comes from Mozilla itself, in that the default XPath engine doesn't allow for XPath extensions, something this particular implementation sorely needs.
Orbeon has emerged as my favorite server-based XForms engine, and the organization has worked hard to improve some significant performance issues that were one of my biggest reservations about using it regularly. I still prefer client-side to server-side XForms as a general rule, but given the rather fractured state of the art on the client side, Orbeon is far and away the most robust server-side XForms solution I've seen, and I think that as interest in mixed XForms/XQuery systems rises, Orbeon should do quite well.
FormsPlayer has long been a powerful tool for XForms on the client (primarily Internet Explorer), and Mark Birbeck, CEO of FormsPlayer, has similarly been one of the major advocates of XForms (and more recently of RDFa, another very intriguing technology). Mark’s become one of my people to watch (and someone I converse with on a regular basis) because he is typically one of the first people I know to not only spot but be instrumental in pushing the Next Big Thing in the XML space, and some of the ideas that we’ve discussed lately very definitely make me see FormsPlayer as a critical company to watch in 2008.
Finally, I want to shine a big spotlight on PicoForms. While others have been duking it out on the browser platform, PicoForms has been quietly building up a solid reputation in the embedded mobile market for XForms component production, recognizing (correctly) that if forms are problematic in browsers, they are downright primitive on the mobile web. My feeling is that XForms will likely be well established in the mobile space (as SVG was before it) before it makes it in the browser space, and I think that PicoForms is nicely positioned to shape that market.
The Semantic Web Becomes Real. Okay, so maybe I'm getting a little cynical about the semantic web myself, perhaps because too much of it of late has been so heavily tied into RDF, but I think there are some indications that this is changing. RDFa (RDF in attributes) provides a way of using a microformat-like approach to add RDF triples and related items into both HTML and XHTML code, a significant step forward from the way that RDF is used now. There's a growing movement afoot to use a more XQuery-like rather than SQL-like language for performing SPARQL queries over RDF, and I'd not be at all surprised to see a formal sparql: namespace with an appropriate XQuery API published by the W3C within the next year or so.
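To see why RDFa matters, consider how little it asks of page authors. A hypothetical fragment (the URL and names are invented; the attributes are RDFa's own):

```xml
<!-- ordinary XHTML, with Dublin Core triples carried in attributes -->
<div xmlns:dc="http://purl.org/dc/elements/1.1/"
     about="http://example.com/articles/xml-2008">
  <h2 property="dc:title">XML in 2008</h2>
  <p>By <span property="dc:creator">Jane Developer</span>,
     published on <span property="dc:date">2008-01-02</span>.</p>
</div>
```

A harvester can pull three clean triples out of that markup without disturbing how the page renders - which is precisely the microformats trick, but with a real data model underneath.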
Again, what this points to is that after more than a decade in which the pieces of the W3C domain existed largely in independent silos, the realization has been made that in order for these technologies to succeed in the long run, they have to be integrable. Semantic web information is, again, simply data: it needs to be queried as data, needs to be presented in a cohesive fashion, and needs to be transported across a suitable set of messaging protocols. In order for SemWeb information to become a part of the web, it needs to play nice with the rest of the web; there are signs that this is in fact happening, and this openness (and getting it out of the hands of academics and into the hands of workaday programmers) will go a long way toward making the semantic web useful for the average web developer.
Companies/Projects to Watch: I’m going to pass in this space for the moment, as I haven’t seen any SemWeb companies yet that clearly differentiate themselves from the pack. However, I’m likely to come back to this list in my mid-year review.
Atomic Power. I think that AtomPub is about to transform the way that we work with web data. This is a pretty profound statement, given the large, vested interests of all of the SOA vendors out there, but consider this: let's say that you are working with an XML data document - such as an invoice - possibly through some kind of XML editing application such as XMLSpy, oXygen or Stylus Studio. You post the document via AtomPub to a web server, which processes it and adds it to an XML queue (for that matter, the publisher could be a stand-alone application, an XForms page, or an extension in a web browser; it really doesn't matter what the client looks like).
Once you post it via AtomPub (in a manner analogous to the way that you would post a blog) the entry’s publication information is recorded, and you can make an Atom news feed that will show all of the recently posted invoices, possibly either directly or more likely through a parameterized search and filter mechanism. That feed in turn can be read by any news reader that has the appropriate permissions - so that you can read the latest invoices as records in your Google home page, shown in Drupal or other desktop portals that rely upon syndicated feeds, or perhaps in a stand-alone widget. Clicking on the link will bring up either the view page or edit page for that content in the appropriate editor, which can then cycle back through the process.
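On the wire, that initial post is nothing more exotic than the following sketch (the collection URI and invoice vocabulary are hypothetical; the envelope is a standard Atom entry):

```
POST /collections/invoices HTTP/1.1
Host: example.com
Content-Type: application/atom+xml;type=entry

<entry xmlns="http://www.w3.org/2005/Atom">
  <title>Invoice 2008-0001</title>
  <id>urn:uuid:0f81e2a4-1111-2222-3333-444455556666</id>
  <updated>2008-01-02T10:00:00Z</updated>
  <content type="application/xml">
    <invoice xmlns="http://example.com/ns/invoice">
      <customer>Acme Widgets</customer>
      <total currency="USD">1500.00</total>
    </invoice>
  </content>
</entry>
```

The server answers with 201 Created and a Location header pointing at the new member resource, and from there the invoice is just another item in a feed.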
I’ve been working on a system like this, as have several other people that I communicate with on a weekly basis, and the results are rather astounding. There are signs that this kind of an architecture is making its way into bigger corporate enterprise stacks, a marriage of XML databases, syndication services, XML-client frameworks and a comprehensive XQuery search capability - expect this to become increasingly the normative for larger scale architectures.
I’m anticipating that you’re going to see both SOA based systems with SOAP as the primary messaging standard and Syndication Oriented Services (SynOA) with Atom as the primary messaging standard develop alongside one another, perhaps with an overall merger of the two systems by the end of the decade. The two architectures are not all that incompatible, and there are places where each serves a specific function better than the other, which is why I don’t necessarily see there being a lot of benefit in seeing an architecture war in that space.
Companies/Projects to Watch: This is, unequivocally, Mozilla's game right now, though Adobe has also been fairly heavily involved in the process, as their ActionScript usually remains a functional superset of the ECMA standard. The best place to keep apprised of activity here is the ECMAScript home page at http://www.ecmascript.org/.
A Standards-Compliant Internet Explorer? Possibly. I recently saw a piece indicating that the IE8 pre-alpha has just achieved 100% Acid2 compliance, which is highly encouraging. At this point Opera 9.5, Firefox 3.0b2 and Safari also pass, which means that for the first time, all of the primary browsers can implement CSS 2.1 properly. Okay, mind you, CSS 2.1 was standardized nearly a decade ago, so again you'll need to forgive me for being just a wee bit cynical about this level of standards compliance. What this does indicate is that there may in fact finally be a recognition on the part of Microsoft's product managers that standards do matter (even when they don't originate with Microsoft), and so may also indicate a growing willingness to stop trying to break the web and start working with the various standards bodies, rather than trying to reinvent the web to fit their vision of it.
Actually, I think there’s a lot of conflicting signals coming out of Redmond right now, which suggests to me that there are actually some significant arguments within the Microsoft campus as to which particular strategy to follow. Certainly, their existing one is not winning them friends among either developers or, truth be said, customers, who are increasingly finding that they have to maintain two separate code bases to support Internet Explorer and other browsers simultaneously. As the AJAX stack continues to gain complexity, this cost gets to be significant both in terms of performance and development time.
Internet Explorer is a somewhat troublesome product for Microsoft - they would prefer that it go away, that they could turn the desktop into the browser and export the Microsoft version of the web (Silverlight bells and whistles and all) to the rest of the world. That was in fact the original (long lost) goal of Vista, and the fact that they have fallen far short of it is now causing a significant amount of restrategizing in Redmond. Somewhere along the line, they will need to come to an accommodation with all of the other players on the web, or risk marginalization in one sector after another. IE is critical for that, primarily because anyone on Windows now spends proportionally far more time working within the context of a web browser than in any other product, making the browser a fulcrum for getting people into Microsoft Office, working with Outlook, connecting to Exchange servers and so forth; lose the browser, and everything else will collapse.
Thus I see, by late 2008 or early 2009, an earnest attempt to adopt other W3C standards and to become the best browser on this new baseline. It likely won't last long, if history is any indication, but perhaps it will last long enough to ensure that there actually is a solid baseline.
There are a few additional notes here - the Eolas patent dispute, which forced Microsoft to make ActiveX controls "click to activate", has been resolved - they paid the license fee. This reverts ActiveX components back to the original mode, where they activate in Internet Explorer when the page loads. This isn't necessarily a good thing, in that it once again makes it possible for rogue ActiveX code to run before any user intervention, but at the same time it makes ActiveX-based applications viable without requiring sometimes byzantine user interaction. Whether the number of viruses attacking IE will rise dramatically as a result remains to be seen.
A mention of Silverlight is in order here. Microsoft has been quietly giving its blessing to a Linux version of Silverlight entitled Moonlight, which would effectively migrate the XAML codeset over to the *nix environments. It's questionable how efficacious this effort will be, but given the degree to which Mono has made its way into most of the Linux distros, I think that it would be foolish to ignore the potential for an open source Silverlight in this space - certainly it will be newsworthy in 2008.
MS LINQ, XQuery, XSLT2. While on the topic of Microsoft, let me address a few of Microsoft's internal XML technologies. It has become obvious from a fair number of reports in recent months that LINQ has become the favored child of Microsoft's XML technologies, and as such has pushed any native XSLT 2.0 or XQuery 1.0 processors (outside of SQL Server) off the roadmap completely. This means that the likelihood of a Microsoft XSLT2 implementation coming out in 2008 or even 2009 has dropped from quite possible to highly unlikely.
LINQ, a lightweight XML model that is also intended as a general data-binding solution, is not, in point of fact, that bad a technology - overall it solves some very real problems with XML manipulation, works well across the various flavors of Microsoft's flagship languages C# and VB.NET, and introduces some much-needed functional concepts into the highly imperative design modes that Microsoft employs for its own projects. I fully anticipate that LINQ will continue to make its way into other parts of the Microsoft space, though to date I've seen no indication that it will migrate into Internet Explorer 8.
However, it is not a general XML solution - it doesn't deal well with complex transformations in particular, an area where XSLT2 is especially strong (and where XSLT 1, even with the fairly extensive add-ons that the .NET version supports, is both verbose and cumbersome in comparison to its descendant). Whether XQuery support is needed is debatable, though I personally think a strong case for XQuery support in ASP.NET can definitely be made, if my experiences with eXist are any indication. With several key XML people (and XSLT advocates) leaving the XML data group in 2007, prospects are looking rather dim for Microsoft support of these key technologies at this time.
JSON, SOAP and Atom. Messaging is the lifeblood of the web, and there are compelling reasons for all three formats. JSON has some fundamental problems in expressing document-like structures, a limitation that arises from the way that names are expressed, but it has some strong advantages for working with standard OOP objects. SOA-based systems do not integrate terribly well with web clients because of the complexity of setting up SOAP messages in that environment, but they work especially well in high-speed services environments, while Atom-based SynOA works reasonably well as a transport mechanism but has relatively little penetration in terms of existing server support (though that's changing).
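The document-structure problem is easiest to see with mixed content. Markup like the following is bread and butter for XML (the vocabulary here is invented):

```xml
<para>The invoice is <emphasis>overdue</emphasis> as of last week.</para>
```

The nearest JSON rendering has to break the sentence into an ordered array of strings and objects, because JSON names can't interleave with text the way child elements can; it's expressible, but no longer natural to author or read. For plain object graphs, of course, the advantage runs the other way.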
I’m anticipating that sometime in 2008 it will work to the best interests of the W3C and other standards organizations to come to some kind of understanding about how to make sure that all three can continue to work effectively in a concerted manner, rather than competing for mindshare and code resources. This hasn’t been a big issue yet largely because the three environments have been largely isolated from one another, but as their circles overlap the issue will become much more important.
Mozilla Retrenches. Mozilla is going through some serious growing pains at this stage, and those will continue well into 2008. On the one hand, they will be releasing Firefox 3.0, which offers some much needed improvements in performance (especially with the rendering engine) and no small amount of changes within the extensions architecture to make them less of a drag on the overall system. Moreover, late in 2007, Mozilla split off the Thunderbird mail client into its own development company (code name MailCo, though this will almost certainly change) - a move that will let the core development team concentrate on browser functionality.
Unfortunately, Mozilla faces serious problems in 2008 - competent Mozilla programmers are becoming harder to find, and the changes to the code base will mean that even those who are up to speed on older versions of Firefox will likely face a learning curve in becoming proficient with 3.0. The slowing economy is affecting ad revenues from Google (still one of the primary capital injectors for Mozilla Corp.), and Mozilla's own plans for offline support are coming into conflict with Google's plans for Google Gears. Firefox's meteoric rise in market penetration against Internet Explorer has slowed (though by no means stopped), while rivals such as Opera and Apple's Safari are ratcheting up their own promotional efforts.
Mozilla is also suffering from fates common to many open source projects - factionalism and ennui. The factionalism is making it difficult for the organization to establish a clear direction moving forward, and a number of Mozilla "core peripheral" projects (such as XForms, SVG, the embedded browser effort and so forth) are facing a critical developer shortage that is only exacerbated by turf wars and the well-known propensity for top developers to have trouble delegating.
I’m anticipating that this will be a retrenchment year for Mozilla - a necessary house-cleaning that will need to take place organizationally before the organization can move forward with a 4.0 version of Firefox. I’m still fairly bullish about Mozilla in the long term, but I suspect that their fortunes will likely only pick up near the end of 2008 as developers who have been working hard on getting 3.0 out the door can be shifted to other projects.
Google Grumbles. I'm not seeing a fantastic year for Google, nor even all that good a year, though I don't think it will be as bad as some analysts predict. Google is still very much dependent upon the ad revenue side of its business, and the slowing economy is translating into a significant slowdown in ad revenue as well. This will likely not translate into a huge cut in their operating capital, but it's likely that you will start seeing some cost-cutting going on in their campus-like headquarters in Mountain View, California (though I suspect that the same slowdown will affect other competitors far more).
I am expecting to see several things from Google this year. Their recent announcement of the Android SDK and their spearheading of the Open Handset Alliance both indicate that they see this as their opportunity to break into the lucrative mobile market in a typically Google-esque bit of jujitsu - not by fielding their own product, but by changing the rules of the game. Whether the alliance effort succeeds remains to be seen, though with the wild success of the Apple iPhone, the old hegemony of phone manufacturers has shown itself to be vulnerable.
I suspect that recent rumors of Google also getting into commodity hosting services will likely prove true - it's consistent with their recent forays with Gmail and their online office suite, it poses a serious challenge to Amazon (which is doing surprisingly well with its S3 services) and it helps promote an enterprise wing of their organization. The challenge in pulling it off may very well be physical - I keep seeing this science-fiction-like vision of vast windmill farms that exist solely to support deep underground vaults filled with servers out in otherwise desolate wastelands, with the heat from those servers in turn powering steam turbine generators that add to the electrical power grid. While I suspect that Google will be a net power consumer for some time, I have no doubt they will manage to turn even electricity generation into a revenue source.
What I have more doubts about is their capability to produce good software. OpenSocial, their social networking API, has met with both yawns and jeers, both because of some questionable design issues and because of the concern that perhaps Google is seeking to promote OpenSocial largely as an effort to undermine Microsoft and Facebook. I also suspect that the social networking market right now has too many dollars chasing it for a formal open source initiative to be that successful - they would do better to shelve it now and wait a couple of years as the social networking space collapses (see below).
Their online office offerings (ooo?) have also been less than completely successful; while innovative, many people, including myself, have been underwhelmed by the lack of functionality and the sometimes unstable interfaces, especially in comparison to offerings such as Zoho, which I'll add as another Company to Watch. Zoho's interfaces are clean, reasonably fast, full-functioned without being confusing, and quite sophisticated; the fact that you can do the "standard" office suite applications with it - word processing, spreadsheets, presentation programs, wiki content and so forth - makes the suite most attractive in this regard, especially given its platform-neutral stance, browser- or system-wise.
The one area where I see Google continuing to exert influence will be in the GIS arena, though I suspect that a lot of this will be behind the scenes rather than consumer-facing. Google has had a thoroughly disruptive influence on geographic information systems, with Google Earth and Google Maps combining to wreak havoc in what has at times been a fairly hidebound industry. Other initiatives to watch closely here are the work of the Open Geospatial Consortium (OGC) and the rise of open standards such as GeoRSS.
Dare Obasanjo wrote in a recent column about 2007 being the year that Google jumped the shark. While I respect Dare's skills as an analyst and commentator, I'm not sure that I agree, though I do think that he's effectively pointed out that Google is experiencing many of the pains that a maturing company tends to be prone to - an erosion of "geeky coolness", the first wave of high-profile defections, and strategic mistakes where the company has chosen to be too conservative or not to stretch itself enough. I'd add to that a growing case of Not-Invented-Here syndrome, where a company feels that any project that doesn't originate in its halls isn't worthwhile. It may be time for Google to become more acquisitive of smaller startups (and play more of a VC role in that regard) - its size, market position and inertia make it harder for Google to be the innovator, but it could (and should) focus more on incubating the talent that is out there.
Yahoo, AOL, Amazon. I'm lumping these three together for the sake of narrative convenience, not because I see the states of these three as being anywhere comparable. Yahoo had a rough year in 2007, but that which doesn't kill us makes us stronger, and I think that Yahoo may actually end up being in a stronger position in 2008. Yahoo's problems arose in great part because they spent too much time trying to masquerade as a social network without ever really managing to "catch" in that regard - social networks are a lot like celebrity ... there are usually one or two darlings in the media, but once the cool people move on, it's damnably difficult to woo them back (and those cool people are SO fickle).
Yahoo’s returning to its roots as a search technology , and I think it may have considerably more success there. After the high profile exit of Adam Bosworth, Google’s health initiative basically collapsed, and elsewhere health services have been a non-starter for other large companies. However, Yahoo seems to be doing it right, and is recognizing that to do so effectively you need a way of marrying structured and semi-structured storage with security and largely automated presentation. I think that online health care may be the killer app of early 2009, and Yahoo has the inside track there.
AOL, on the other hand, is fast becoming a basket case. Chronic layoffs, poor strategic decisions, lackluster (and in some cases just bad) technology, and a declining user base have all contributed to AOL's continuing downward spiral. AOL has become the computer company equivalent of the CBS network, catering to an increasingly elderly population and finding it harder and harder to attract quality talent or come up with "hot" technology. AOL got a cash infusion from Google in 2006, but this has only postponed the inevitable ... AOL is burning out, and the only thing keeping it afloat is that it continues generating ad revenue in excess of its losses from poor management.
Amazon, by contrast, has managed to be rather delightfully quirky during 2007, and I suspect may continue to be so into 2008. The bet has taken a while, but the impressive numbers coming from online sales have finally proven Jeff Bezos to be quite prescient in his original vision - even as brick-and-mortar companies succumb to the expanding credit crunch in the US and elsewhere. My feeling is that Amazon is now positioned as the 21st-century equivalent of Sears through much of the 20th century - a general merchant selling decent product to middle-class consumers and dominating its market completely. (Of course, I also think that Sears is not likely to last another ten years, so even this strategy ultimately has its limits.)
Amazon has also managed to become a giant in both the web services and virtual server markets - while it had a few birthing pains with its S3 hosting services, they have very quickly become the benchmark that other companies are trying to emulate in that space. Amazon pioneered the use of web services APIs that are now becoming commonplace throughout the industry, and every year they come out with ideas that are original, if not always profitable (the unfortunately named Mechanical Turk comes to mind).
The Coming Social Networks Meltdown. Online communities have been a staple of computing since well before the Internet itself came on the scene - indeed, much of the proto-Internet came from bulletin board services (BBSs) that saw the new HTTP protocols as a way to simplify their operations. Thus, I'm always rather amused when the next generation rediscovers (and usually renames) social networking.
The problem with social networks is the aforementioned cool factor - they are the equivalent of hit TV shows. A few early adopters find a cool niche and jump to it, and the key trendsetters in turn bring in more and more of their friends. Then, alas, it becomes harder and harder for the network in question to keep pumping in value fast enough to satisfy all of the new people; moreover, as this process continues, the edgy and controversial gets cut in order to satisfy the mainstream. The trendsetters move on, tired of being lumped in with everyone else and fully aware that where they go, others will follow. Of course, by this time, everyone else realizes what an incredibly easy racket creating social networking services really is, and you get dozens and dozens of me-too sites that languish for lack of participants; eventually all of those investments meet the hard reality of actually making a profit, and they fold (and large companies spend obscene amounts of money trying to buy up these ephemeral bubbles before they actually pop).
We’re about to go through one of those folding periods. MySpace has become passe, Facebook is showing strains around the edges, and the names of companies that are emerging now sound more like advertisements for pharmaceutical drugs than they do real companies - Plaxo comes to mind on that especially. Google’s OpenSocial initiative has gone nowhere (nor was it likely too - there’s still too much wish-fulfillment going on at the investment level for peope to come together).
2008 is going to be brutal for the social networking space, and this is one of the few areas in the tech sector where I see the larger global recession seriously impacting IT (see Economics, below); advertising is the other. My suspicion is that there will be a couple of large survivors that will be the Yahoos of tomorrow, though whether those include either MySpace or Facebook is debatable. As a guess, I'd point to LinkedIn as one to watch in this space - they continue to execute smartly, they understand their market, they have stayed relevant for a couple of years (always a good sign) and they continue to offer considerable benefits to participants. Moreover, as the economy worsens, there are worse things to be than a service that lets you trade on your connections to find work.
Open Source and Open Standards. This will, once again, not be the year of the Linux desktop, though I suspect that may not be true for much longer. In a weakening economy, price point becomes a major factor, and by more than a few indications Linux-based computers in the $300-500 range (and sub-$800 notebooks) are likely to continue growing in appeal, though distribution channels are still fairly heavily tilted towards Windows. The rise of virtualization (one of the big stories of 2007, and likely to be even bigger in 2008) has also meant that it is becoming feasible for even non-techies to run multiple OSes without having to deal with the hassles of reboots or multiple machines, and it additionally means that developers are increasingly expected to be proficient in as many operating systems as possible. While still not common, anecdotally I'm seeing more Linux installs on laptops in public.
Open source software tends to wane when the economy is good and wax when the economy turns rough - both because of the cost factor in development and because monetary motivations are generally higher in good economies, while it becomes easier to put something out for free when nobody's buying or selling anything. A lot of OSS projects have been relatively starved for talent because of the developer crunch, and I see that trend continuing into 2008 but starting to break by the end of it if the recession (see the next point) spreads beyond the housing and financial sectors (which I anticipate will happen).
As to open standards - I think we’re going to see a period of consolidation and integration within the W3C after a few very productive years. The XPath family of languages is largely complete and solid (though I see the rumblings of an XForms 2.0 in the near future), the Semantic Web languages are close to complete, the initial burst of activity to align HTML with AJAX will continue but within a clearly defined timeline, and the mobile initiative set out a couple of years ago will likely run its course in 2008 or early 2009. My anticipation is that there will be a drawdown of activity at the W3C over the course of the next couple of years, at least until the next major wave (which I suspect will be SemWeb-driven, starting around 2011).
Pay close attention to binary XML, specifically the Efficient XML Interchange (EXI) format. The format is reaching consensus within the W3C, and it will likely debut in the summer of 2008. EXI reduces the size of XML data streams significantly, in a way that is also more conducive to rapid parsing and serialization, and it will likely have a significant impact on SOA systems in particular. This will in turn spur a mini-burst of activity as EXI-enabled XSLT, XQuery and similar tools emerge, though I don’t see that happening much before 2009 at the earliest.
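To make the size argument concrete, here is a minimal sketch of the core idea behind formats like EXI: an event stream in which repeated element names are replaced by small integer references into a string table, so that each name crosses the wire only once. This is emphatically a toy, not the actual EXI wire format - real EXI uses grammar-driven event codes, optional schema information, and bit-level packing - and every class and method name below is my own invention for illustration.

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

/**
 * Illustrative only: a toy tokenizing encoder in the spirit of EXI.
 * Real EXI is grammar-driven, optionally schema-informed, and packs
 * events at the bit level; none of that is modeled here.
 */
public class ToyBinaryXml {

    private static final byte START = 0, END = 1, TEXT = 2;

    private final Map<String, Integer> table = new HashMap<>();
    private final ByteArrayOutputStream buf = new ByteArrayOutputStream();
    private final DataOutputStream out = new DataOutputStream(buf);

    /** Write a string-table reference; the literal bytes go out only once. */
    private void writeName(String name) throws IOException {
        Integer idx = table.get(name);
        if (idx == null) {
            idx = table.size();
            table.put(name, idx);
            out.writeShort(idx); // index == old table size signals "new entry"
            byte[] bytes = name.getBytes(StandardCharsets.UTF_8);
            out.writeShort(bytes.length);
            out.write(bytes);
        } else {
            out.writeShort(idx); // a repeat costs two bytes, not the full name
        }
    }

    public void startElement(String name) throws IOException {
        out.writeByte(START);
        writeName(name);
    }

    public void endElement() throws IOException {
        out.writeByte(END); // element names are implicit in a well-nested stream
    }

    public void text(String value) throws IOException {
        out.writeByte(TEXT);
        byte[] bytes = value.getBytes(StandardCharsets.UTF_8);
        out.writeShort(bytes.length);
        out.write(bytes);
    }

    public int size() {
        return buf.size();
    }

    public static void main(String[] args) throws IOException {
        // Compare a repetitive text-XML order feed against the toy encoding.
        ToyBinaryXml enc = new ToyBinaryXml();
        StringBuilder textXml = new StringBuilder("<orders>");
        enc.startElement("orders");
        for (int i = 0; i < 1000; i++) {
            textXml.append("<order><sku>A1</sku><qty>3</qty></order>");
            enc.startElement("order");
            enc.startElement("sku"); enc.text("A1"); enc.endElement();
            enc.startElement("qty"); enc.text("3");  enc.endElement();
            enc.endElement();
        }
        textXml.append("</orders>");
        enc.endElement();

        System.out.println("text XML bytes:   "
                + textXml.toString().getBytes(StandardCharsets.UTF_8).length);
        System.out.println("toy binary bytes: " + enc.size());
    }
}

Even this naive scheme roughly halves the size of a repetitive message feed, and unlike gzip it leaves the stream cheap to parse event by event; EXI’s grammar-based approach does considerably better on both counts, which is why SOA message traffic is the obvious first beneficiary.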
Outside of the W3C, I expect ultimate ratification of Microsoft’s OOXML standard by ISO in 2008, though the vote will likely be close. I’ve made my own (generally negative) opinions about OOXML known, but I also think that at this stage the combination of a valid need to provide some formalism around Microsoft Office and a potent (and likely somewhat ethically challenged) campaign on the part of Microsoft will see the specification passed.
An Economic Credit Crunch and IT. This is where I get to wear my economist hat, and thus make totally outrageous statements that I can hopefully ignore next year, because no one ever gets economics right. Seriously, though, my biggest economic concern in 2005-2006 was the erosion of the dollar that I saw coming over the next couple of years and the looming impact of the housing market. It was a big part of the reason that I moved from the United States to Canada nearly three years ago, and in retrospect it was the smartest decision I could have made.
When I moved to Victoria in 2005, the Canadian loonie bought about 78 cents American. Today it buys $1.02 American, after briefly spiking as high as $1.10 a few months ago. As an investment, that amounts to roughly a 31% gain. The reason this happened was that the actual cost of borrowing money was effectively zero - there was no apparent risk in borrowing in the short term, because interest rates were effectively below the rate of inflation. This led to a huge degree of malinvestment - why worry about spending money when it effectively costs you nothing to borrow it? - primarily in real estate, but secondarily in equities and credit markets.
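(For the record, the arithmetic: $1.02 / $0.78 ≈ 1.31, so a loonie’s value in US dollars rose about 31% over the period, before any interest.)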
In August, even as the credit markets were seizing up due to bad mortgage investments, the Dow Jones Industrial Average hit a nominal all-time high above 14,000. However, measured from anywhere except the United States, stocks are still underwater if you invested in the market in 2000, and moreover are only at about 75% of their value when adjusted for inflation. There was no question about whether this would happen - it was obvious a couple of years ago. The only question was whether inflation or deflation would hit first. The answer seems to be that they are hitting at almost exactly the same time.
The danger faced here is pretty profound, and likely to become more so given recent geopolitical developments. In 2001-2002, the Fed lowered interest rates to historic lows in order to stave off the potential for the tech recession to spread to the rest of the economy. It worked, but at the cost of building up the housing bubble. The Fed could do the same trick again now, as the housing sector pulls the economy down, except for one not-so-minor problem: foreign investors now own a considerably larger share of US debt than they did even in 2002. Should interest rates be lowered too fast or too far, the dollar will drop significantly in value. While that helps exports (and hence the balance of trade, to a limited extent), the drop translates directly into higher prices for foreign-produced goods and materials - most especially oil, natural gas, and many other resources - which has in turn fed dramatic food and fuel inflation. Also note that among those selling is China, whose treasury officials have indicated that they will continue to divest their US Treasury holdings and are letting the renminbi rise relative to the US dollar in order to combat the export of inflation to that country.
On the other hand, if the Fed raises rates, it exacerbates an already severe credit crunch, making it much harder for both individuals and companies to borrow money. This strangles the economy both at the top (the distribution of money) and at the bottom (the ability of people to spend money, and hence keep the great capitalist scheme running). The last time this happened was in the late 1970s (due largely to the US move off the Bretton Woods gold standard in the early 1970s), but in a number of critical ways the situation now is closer to that of the earliest years of the Great Depression of the 1930s, or the Japanese recession of the 1990s, from which Japan has only recently begun to recover.
Now, there are a lot of variables here that weren’t in place in any of those economic slowdowns, but by my reading there are very few economists, even bullish ones, who do not see a recession coming down the pike in 2008 - the only question is one of severity. And this is where I get back to the subject at hand: the IT sector. One of the things to keep in mind about recessions is that they do not affect all industries equally. The tech recession of 2001-2003 is a prime example.
If you were in an area that was only peripherally connected to the tech sector, you might have been pardoned for wondering whether there was a recession at all. On the other hand, if you were in IT, chances are that at a minimum you lost a considerable amount of paper wealth, and at worst found yourself unemployed and unemployable, with no way of making the house payments or even getting enough food to eat. The sector has yet to fully recover from that, though the recession actually did what recessions are supposed to do - it cleared out excess capacity, exposed fraudulent activity, and made people considerably more aware of both the strengths and weaknesses of technology and its associated industry.
For programmers, it also made it possible to do what they do so well - it gave them the time to invent, explore, communicate and learn without the pressure of having to get a given product out yesterday. It turbo-charged the open source movement and made it possible for some of the most innovative products and projects to come to fruition despite the efforts of corporate naysayers and marketing managers who just didn’t get it. I seriously doubt that most programmers would prefer to go through that again, mind you, but in the long run it strengthened IT in ways that would not have been possible in the go-go ’90s.
What’s coming down the pike today will affect a lot more people, but ironically enough, except in a few areas, it is likely to have relatively little effect on IT. Unlike in the late 1990s, the internet operations of many companies have become their primary outlet for doing business, even as brick-and-mortar sales have dropped. Demand for programmers and program managers remains high even in the face of dropping payrolls elsewhere, and the supply of programmers is now at 1996 levels, though spread out among far more technologies.
The weakened dollar also contributes to the strength of IT in the US: it makes “off-shoring” a considerably less attractive option for US companies facing rising labor costs abroad (due both to exchange rates and to localized salary increases in those markets), while at the same time making US programmers more attractive. Of course, for Canada that’s not necessarily good news, but that’s grist for another mill.
The sectors within IT likely to be most vulnerable are those in consumer categories (though my expectation is that this will manifest less in reduced sales and more in increased piracy, especially with respect to computer games) and advertising-sensitive companies (though again, this is less likely to be an issue for online advertising - which is somewhat underpriced - than it is for offline media). A downturn also tends to make high-end commercial business software less attractive, especially where open source alternatives exist, so it’s likely that you will see continued price deflation for software costing $10,000 or more.
From a larger business perspective, expect mergers and acquisitions to slow considerably as the cost of raising capital for such deals becomes prohibitive - which also means that it will become harder for large companies to buy out start-ups, especially given the eye-popping sale prices of recent years. Furthermore, expect that most software companies will go into a neutral hiring mode - hiring a replacement for each programmer who leaves - but will be unlikely to expand their IT base much if at all, at least until they can determine the risks to their bottom line of increasing head count. This also means that the time between jobs for independent contractors will probably rise by a couple of weeks or so, so programmers should become a little more wary of job-hopping to get better wages, and should price that risk into their anticipated rates.
Overall, 2008 should prove to be an interesting, if somewhat nail-biting, year. There are signs that XML is maturing at the enterprise level and is beginning to make its presence felt within web browsers and web interfaces (especially beyond simply serving as local data islands or data stores), and XML is also becoming a solid data technology in its own right, rather than simply a messaging or “document” format. In general, the coming year should not bring too many huge disruptions, but it will see a number of standards that have been in the works for several years start to become widely deployed, including in the Semantic Web, XPath-family, and compound document arenas. I’m less optimistic about proprietary XML client frameworks - they will continue to achieve some market penetration, but likely not as much as their marketing departments would like to project. Beyond that, macro-economic trends will begin to have an impact upon XML and IT in general, especially towards the latter half of 2008 and into early 2009, though probably not as dramatically as in years past.
So, to one and all, I hope that your 2008 proves to be enlightening, enjoyable and prosperous, and wish you all, my readers, the best.
Kurt Cagle is an author, information architect and analyst specializing in XML and web technologies. He lives in Victoria, British Columbia.