"Think of the indexes as *between* the elements..."
What an absurd way to think of an array index. An array isn't a continuum, it's a numbered sequence of discrete items, so the rest of your "(0,1] [0,1]..." argument is irrelevant.
How do you number three items in every other domain in life? 1, 2, and 3. The first element of an array of items should be array[1], and the Nth would be array[N]. An array whose highest legal subscript is N has N elements.
Easy to count, easy to refer to, and the least susceptible of all systems to off-by-one bugs.
This confusion comes from the fact that array subscripts in close-to-the-metal languages like C are just syntactic sugar for memory offsets. Naturally, the first element of an array starts where the array itself starts, so its offset is zero. my_array[N] lives at the address of my_array plus N*sizeof(element): the subscript isn't a label, it's a pointer to a memory address.
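To make the offset claim concrete, here's a minimal C sketch (the helper name `byte_offset` is mine, purely illustrative) showing that a subscript really is just offset arithmetic: `a[N]` is defined as `*(a + N)`, and element N sits exactly `N * sizeof(element)` bytes past the base.

```c
#include <stddef.h>

/* Illustrative helper: byte distance of element n from the array base.
   Demonstrates that subscripting is nothing but offset arithmetic. */
static ptrdiff_t byte_offset(const int *base, size_t n) {
    /* &base[n] is, by definition, base + n; subtracting as char*
       gives the distance in bytes: n * sizeof(int). */
    return (const char *)&base[n] - (const char *)base;
}
```

So element 0 is at offset 0 bytes, element 2 at offset `2 * sizeof(int)` bytes, and `a[2]` and `*(a + 2)` are the same expression by language definition. That's the whole reason the first subscript is zero in C, and it's an argument about addresses, not about how humans count.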
Unfortunately, from the very start, the most-used languages have been low-level, like C, so programmers have simply gotten it into their heads that arrays *ought* to be zero-based. Perhaps they should be in such languages, but programmers carry that legacy mindset with them even when they move on to high-level languages where memory offsets are irrelevant, where you are simply labeling a sequence of items. If language designers now try to label array elements 1, 2, and 3 as 1, 2, and 3 (instead of 0, 1, and 2), though, the experienced (read: "those with the most legacy baggage between their ears") scream about it, producing preposterous, strained arguments such as the one above, appealing to "mathematics" to explain to gullible people why the ordinary counting everyone uses for everything else in life is somehow not acceptable for counting elements in an array.