@kierenj said:
XML programming languages are a new thing?
What about XSLT? It's a functional language, sure, but still a programming language.
Also IMO: XML is good. Doubly so with .NET. I spend all day writing .NET solutions, and the ability to persist a DataSet / DataTable to disk for permanent storage with "WriteXml" or similar is great.
I agree that having a persistence system is a good thing, but you don't need XML to build one. It would be just as easy to use if it persisted the data in binary, S-expressions, or anything else, so XML doesn't provide much added value here.
Great because (1) you can directly transform that XML into reports etc using XSLT,
And you can transform anything else into reports using any scripting language. I grant you that XML makes the input a little easier to parse, but most of the work is still making sense of the data and restructuring it into your report, and unless XSLT is magical, I don't see how it would make that much easier than another language.
(3) if it all goes wrong, just open up the XML file - you can read the data in there yourself.
I work with binary files a lot in my current project, yet the occasions where I actually have to open one in a binary editor to see what it looks like are quite rare.
A human-readable format is useful, but you can get one in other ways than XML anyway.
Is there anyone here who can provide an example as to where XML hasn't been useful? i.e. it's hindered them?
Some people here wrote call-graph and memory-usage tools tailored to work with our project. They output XML files.
Those files are so huge that extracting anything from them is a horrible pain in the ass; some of them take several minutes just to open.
It would have been so much better to store this data in a lightweight RDBMS like SQLite.
XML doesn't scale well when manipulating large amounts of data. Therefore, choosing XML over something else can often be a dangerous idea.
I could also mention that we have a lot of small XML files. We use TinyXML (a small and nice C++ DOM implementation).
We're often loading a lot of these files. It's slow, and it leads to a lot of unnecessary copying and shuffling of the data in memory to turn the DOM into the bunch of objects we're actually using. It also means allocating and freeing lots of small memory blocks, which is a big performance turnoff.
Again, it's caused by solutions that look sexy at first glance, and by the "well, they provide tools, let's trust that those tools are good and just use them" mentality. XML is actually full of traps like that, and the number of people always heralding it as the bestest possible way to store data isn't helping.
Also, I could mention problems caused by the "XML is a silver bullet" mentality in a previous company I worked for.
We were developing two games: one on PS2/GameCube, the other on Game Boy Advance. Both were similar adventure games (obviously with very different representations).
We made a custom scripting language for the GBA version (I would rather have used Lua, but some people there had decided it wasn't efficient enough...).
The PS2/GameCube team decided to apply the "universal XML solvent" to the problem: they threw together a finite-state-machine engine and used an XML description of it as their scripting language.
Defining a simple interaction, like making it so that pulling a lever opens a door, took about three lines in our scripting language.
It took them something like two pages of XML. They also had to build a system to convert the XML data into binary files, because the parser was wasting too much memory and CPU time for the game consoles.
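Reconstructed from memory, so every name and bit of syntax here is hypothetical, but the two versions looked roughly like this. Ours:

```
on_use lever_01:
    play_anim lever_01 "pull"
    open_door door_03
```

Theirs, encoding the same interaction as a state machine, started like this and went on for pages:

```xml
<statemachine name="lever_01">
  <state name="idle">
    <transition event="use" target="pulled">
      <action type="play_anim" object="lever_01" anim="pull"/>
      <action type="send_event" target="door_03" event="open"/>
    </transition>
  </state>
  <state name="pulled"/>
  <!-- ...and so on for every object and every state... -->
</statemachine>
```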