MacBloQs
Saturday, October 12, 2002
Building the next OSX - part 1

In one sense this BloQ is a continuation of the ".... and other GUI terms" series, but it is inverted, since the goal is to suggest outlines for the kind of UI that OSX could have been. I wish Apple had spent some time thinking their UI philosophy through before developing OSX, rather than making something that is an almost totally reactionary, anti-Mac hodge-podge - but you can only bicker for so long... Many of the suggestions run parallel to what David K. Every wrote recently on iGeek about interface flexibility - the same piece I linked to in the last-but-one BloQ. It is more in-depth - and better written than this BloQ - and you ought to read it first to enhance your understanding of these musings. For now we will focus on the externals, on the mechanisms by which the user interacts with the data; the next BloQ will discuss information storage and structuring.

GUIs are square, not only in the 'seventies sense, but also because they have to be in order to optimize the limited space offered by a standard monitor (irrespective of size and resolution). Kai Krause revolutionized interface perception by releasing his Power Tools programs with non-square interfaces, with organic forms and a distribution of control widgets that was far from functional in an efficiency sense. Instead they appealed to the aesthetic sense of the user group the programs were made for - graphics people - and since the programs offered hitherto unseen (and undreamed-of) artistic possibilities, the "alternative" interfaces were accepted. Krause has since disappeared inside a fortress (or a monastery?), there to think thoughts that might come out some day as a totally new kind of Interface....

The acronym WIMP has fallen into disuse, but it used to characterize the basic ingredients of the Xerox-derived, Apple-developed graphical interface: Windows, Icons, Menus, Pointer.
But it also covered a perceptual paradigm of computing interactions: computing is a holy Trinity of the Hardware, the Software, and the Data - data being manual input - with whom the worshipper interacts through a set of prescribed rituals (data manipulation by the set procedures of the software). And the software was in the centre. As Jared White succinctly sums it up in his discussion of user interfaces in general: a computer is by tradition not data-centric; it is program-centric - and this concept immediately creates distance, estrangement in a new user. When you go to the workbench, you put wood there and begin to saw. When you want to write, you sit down at the table with a piece of paper and begin penning things down. Even for a seasoned computer user a paradigm built on this principle will be more natural and faster to use.

But several factors have changed since the development of the first GUIs, including familiarization and data amount. The data accessible to the average user is no longer primarily his own input; through the internet, CDs, DVDs, cameras and scanners, the amount is huge, and getting huger. And well over 60% of the adult population in the Western world knows the basic use of WIMP-based computers, which means that the Desktop metaphor is effectively dead: the concept carries its own signification, rather than alluding to the original denotation. Parallel to this, the strictness of the Desktop metaphor has diminished, and a long list of capabilities has no physical equivalent or parallel, making metaphorization near impossible. Thus any GUI today is restricted by only three demands: internal consistency, compatibility, and efficiency (standardization/flexibility).

During my years with computers I have had three "epiphanies". The first I'll return to in the next installment of this series, and the third I have forgotten for the present, but the second will be the basis of these musings. Here it is: 2.
Programs and data are the same: binary streams processed in the processors.

Once a user has grasped this simple fact, a new paradigm can be established. The basis of this paradigm is: "Every UI detail is a Display". Display as in Display window, that is. For the moment, think of it as an extended windows metaphor, the extension being based on what was said above: commands and data are both binary strings, and the distinction is to some extent merely pedagogical. A Close widget on a window is a granule, a binary string (or, if you will, a "command granule"). An image is a granule, a binary string - a data granule. Again, the command/data distinction is pedagogical rather than essential. This new perception paradigm is aided by the *nix system of granulating the command structure to such an extent that many CLI commands (including parameters) are really the activation of a single, specialized program. The OpenDoc product was strangled into extinction by silence, mainly due to economic considerations by large corporations, partly because users were not ready for the paradigm. The 'nineties acronym of choice was OOP (object-oriented programming), and the motto was: "Everything is an object". Now we bring it into the new millennium by baptizing it "granule", "formation" (a collection of granules) and "Display".

How does this work out in a user interface? First of all, the fact that all command units and data units are Displays provides an internal consistency that makes for efficient handling, once the user understands it. A text is a data granule, shown in a Display. A group of texts is shown as minimized Displays inside a formation Display. The Menu bar is a formation Display, with other formation Displays (the individual menus) embedded. The Dock is a Display (from now on I will leave out the formation bit, and only specify when it is a Display of a single granule).
A program is a huge formation of formations of command granules in Displays - the term program is used here for the sake of didactic compatibility, and because the industry will take a long time moving away from the concept of program monoliths. Now, let us turn it around: a Display can be made to present itself as a menu. Or as a Dock. Or as a control strip. Or as a palette. To quote David K. Every: "The user has their dock(s) that has the apps that they'd like to run. They can drag them to the upper left corner (like the Apple menu used to be) and voila: the menus scoot over (give it room) and they have a menu of their applications up there. Drag it out to a palette or move it to the edge of the screen for a dock." So a Display can grow handles, according to placement, or scroll bars, or widgets. It can minimize or extend itself according to this placement: menus scroll down to full length, pop-up windows pop up and might have scroll bars, floating palettes... float, minimized windows expand, etc. etc.

Look at your interface and consider every square unit a Display (window), but with slightly different presentational and behavioral rules. Think back to the good old Application Menu - it could be torn off and placed as a floating window anywhere; it could represent its content with text, icons, or text and icons. As so often, Microsoft tried to be smart by extending the concept: everything could be a tear-off menu; however, they didn't go the whole hog and so ended up with a confusion of concepts. You can make everything into Docks, or Menus.... or put everything in Displays that reside behind handles along all the display edges, like a combination of popup windows and many control strips. Displays can be put on top of each other so that each of them becomes a layer and grows tabs - Photoshop didn't invent this tabbing, incidentally: an incredibly advanced Atari ST dtp program named Calamus did (but that's another tale).
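To make the paradigm concrete, here is a minimal sketch in Python (every name is hypothetical; this is not an actual OSX or OpenDoc API, just one way the granule/formation/Display idea could be modelled): every UI element is a Display wrapping either a single granule or a formation, and the same Display can re-present itself as a menu, a dock, or a palette.

```python
class Granule:
    """A single binary unit: a command or a piece of data."""
    def __init__(self, name, payload):
        self.name = name
        self.payload = payload  # the "binary string": command code or data


class Formation:
    """A collection of granules (or nested formations)."""
    def __init__(self, name, members=None):
        self.name = name
        self.members = list(members or [])


class Display:
    """The one universal UI container; its presentation mode decides
    whether it looks like a window, a menu, a dock, or a palette."""
    MODES = {"window", "menu", "dock", "palette", "control-strip"}

    def __init__(self, content, mode="window"):
        self.content = content  # a Granule or a Formation
        self.mode = mode

    def present_as(self, mode):
        # The same content re-presents itself: drag a Display to the
        # menu bar and it becomes a menu, to the screen edge a dock.
        if mode not in self.MODES:
            raise ValueError(f"unknown presentation mode: {mode}")
        self.mode = mode
        return self


# The menu bar as a formation Display with an embedded menu Display:
file_menu = Display(Formation("File", [Granule("Close", b"...")]), "menu")

# A dock of applications, scooted into the menu bar Apple-menu style:
apps = Display(Formation("Applications", []), "dock")
apps.present_as("menu")
```

The point of the sketch is that the command/data distinction lives only in the payload, not in the container: the Close widget and an image travel through exactly the same Display machinery.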
Formation Displays can be put "magnetically" next to each other. It will be possible to attach program granules to Displays in such a way that their behavior can be extended or modified, in a fashion similar to the recent ability to attach AppleScripts to folders. Thus new interface ideas can easily be incorporated, for enhanced individual efficiency. The Dock and the Apple Menu both make it possible to mix programs and files, or in X2-speak: they are Displays that include both command granule formations and data granule formations. This will also be possible in the new kind of Display windows, although the previously mentioned problem of companies insisting on selling only monolithic formations, or at most collections of very large indivisible formations, might limit the flexibility of this.

A program from, say, ..... Adobe (to take a reasonably neutral example) could structurally be seen as one solid complex of commands, or it could be seen as multiple formations of command granules sold and installed as one heap. Now, if people began thinking of products as the latter, they would clamor for the possibility of choosing only the granule formations that they wanted: the Rotation granule of Photoshop, for instance, is faster than that of GraphicConverter. On the other hand, GraphicConverter's Scaling is more efficient than Photoshop's. Now, aside from the question of ease of pirate copying, large programs have a large overhead simply because they are large. Smaller plug-ins, for instance, may demand relatively more coding time because they are more specialized, but people are not interested in paying for what they see as a small program. Big companies will prefer to granulate their products as little as possible and thus limit the mixing mentioned above. More about that, and about the concept of projects, in the next installment.
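The attach-a-granule idea above can be sketched as well (again, all names are hypothetical; this is only an illustration of the pattern, in the spirit of AppleScripts attached to folders): an attached behavior granule is simply a callable hook that runs whenever something happens to the Display.

```python
class Display:
    """A UI container that accepts attachable behavior granules."""
    def __init__(self, name):
        self.name = name
        self._attached = []  # behavior granules attached by the user

    def attach(self, behavior):
        """Attach a program granule: a callable taking (display, event)."""
        self._attached.append(behavior)

    def fire(self, event):
        """Run every attached granule when an event occurs."""
        return [behavior(self, event) for behavior in self._attached]


# A toy behavior granule: log every event that hits the Display.
log = []

def logger(display, event):
    log.append(f"{display.name}: {event}")
    return event


d = Display("Projects")
d.attach(logger)
d.fire("open")
# log now holds ["Projects: open"]
```

Because the hook is just another granule, a third party could ship new Display behaviors without touching the host program - which is exactly the "new interface ideas can easily be incorporated" point.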
The compatibility demand is relatively easily fulfilled by giving (window) Displays forms and functions that approximate the older WIMP interface, as mentioned above. A new, elementary user will be able to choose from several default setups, one of which is Mac-like. But the efficiency demand is more difficult, since it pulls in opposite directions. To a completely new user, the WIMP paradigm is brilliant: all the command choices are there, ready to be explored with the mouse. Seasoned users will no doubt counter that too-tight interface guidelines slow down interaction - and interaction speed is what matters. That is true to some extent: on the one hand, the use of common features will indisputably be faster if they are fixed spatially; on the other, individuals have individual work patterns and individual tool needs (at times specific to the specific activity), and a large degree of flexibility gives room for all that individuality and specificity. The more proficient user relies on muscle memory and proximity to enhance efficiency: the keyboard is increasingly in use, either instead of or together with the mouse, for performing commands. Thus keyboard commands need to be made flexible in the same way as Displays. And then there is the matter of rearrangeability, particularly of the menus...

Program granules can be put in one of three categories: those the program has in common with all other programs, those it has in common with some other programs, and those that are unique to this specific program. Now, thinking about Photoshop, for instance, it is clear that a great many application-specific command granules would need to be made movable in order to promote productivity, but I don't think the basic computer user will be using programs that are in need of such motility.
A point well made by Jared White is that individual programs - not to mention the "command" or "choice" features of a webpage - are not sufficiently standardized: something as simple as font style is implemented differently in almost every program you care to look at. Demanding that a specific set of basic, universally needed granules be set up in specific formations in a particular spatial relationship could alleviate this; the solution could be to distinguish between a menu set of universal commands, positioned en bloc in the menu line (which would still be a Display and thus movable - but there is a good reason why commands were put up "under the roof" originally), and all the other command sets, which could be moved into positions where they become menus, tabbed windows, floating palettes, windows, or magnetic windows. The (non-standard) granule formations that are "set up" as menus in the menu bar have a slight indentation round their Title, indicating that they are embedded there but can be "torn off".

There would have to be a way to save a set-up, a "world". The ability to have several setups saved, and to easily switch between them, would be invaluable - an alternative to Virtual Desktops. Set-ups could also be "attached" to individual applications, so that opening an application would switch the entire set-up (or optionally not do so). The .mac dimension (once it is extracted from the present situation of managerial stupidity) could be brought into play. Individualized user surfaces (no longer interfaces) or "worlds" could be saved as data granules and stored on the net (a la iCal) for personal recall, no matter where the user is. Thus there would be a group of built-in standardized "world" set-ups for beginners or "switchers", and any number of local user "world" setups on a computer. A passing user could recall his personal world from the Net, so that he could work in familiar (and thus efficient) surroundings during the time he is seated at that local unit.
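A "world", note, is itself just a data granule, which is what makes the store-it-on-the-net idea work. A minimal sketch (names hypothetical, serialization format arbitrary - JSON stands in for whatever a real implementation would use):

```python
import json


class WorldManager:
    """Saves and recalls complete interface set-ups ('worlds')."""
    def __init__(self):
        self._worlds = {}  # saved set-ups, keyed by name
        self.current = {}  # live set-up: Display name -> presentation mode

    def save(self, name):
        # Serialize the live set-up into a plain data granule; being
        # plain data, it could as well be stored on a .mac-style server.
        self._worlds[name] = json.dumps(self.current)

    def switch(self, name):
        # Recall a saved world; the whole surface rearranges at once.
        self.current = json.loads(self._worlds[name])


wm = WorldManager()
wm.current = {"Applications": "dock", "File": "menu", "Tools": "palette"}
wm.save("writing")

wm.current = {"Applications": "menu", "Brushes": "tabbed-window"}
wm.save("graphics")

wm.switch("writing")  # back to the familiar writing surface
```

Attaching a world to an application would then just mean calling `switch` when that application opens - the set-up travels as data, not as configuration buried inside any one program.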
It would be the best of all worlds - but it would be yet another paradigm shift.

PS - Please note that the good old file selector, as well as the new hierarchic presentation mode in OSX's Finder, will no longer exist, due to the new kind of file structure used. More about this in the next BloQ.

Tuesday, October 08, 2002
Not killing Classic? - Yeah, right!

In spite of the clamor of protests when Apple announced the demise of OS9 boot-uppability in new Mac models introduced after December 2002, they were probably right in doing so. Apple is not blocking users from booting into OS9 on presently owned models, or on models introduced before January 2003 but sold after that date. All they are doing is not spending manpower on writing the hardware drivers necessary for OS9 to hook into the new hardware. Since it is highly improbable that Apple come January will introduce new models in all the Mac series they sell - iBook, TiBook, eMac, iMac, PowerMac - there will still be OS9-bootable Macs for sale well into 2003 - my guess is until September - and of course there will be new units of out-of-production models for sale more than a year after that.

Another soothing fact is that many of those who want OS9 bootability presently own older Mac models. They are one to two years behind the power curve, and so they can swat two bugs with one down payment: they can buy refurbished or used units of the last models that have that bootability and power up compared to their present (sorry) state, and they can save a nice sum of money by not buying spanking new stuff. Since I'll probably be part of this crowd, the above carries no derogatory implications!

However, rumor has it that this was the second stage in a three-stage OSX launch: first, OS9 application development was declared dead; second, OS9 hardware compatibility is being stoned; and third, Classic will be left behind, like a huge fuel tank jettisoned to burn up on its way down. Many OSX pundits have reassuringly pointed to the fact that most OS9 applications will be able to run in Classic mode when OS9 bootability is no longer possible. They conveniently overlook several glaring gaps: several niche programs (Quark is but one of them!)
that are not (and may never be) available for OSX demand boot-uppability; OSX supports a very limited range of hardware accessories - far fewer than OS9 does - many of which need more direct access to the Mac's hardware than Classic can offer; older hardware, for instance a good deal of that used by musicians, actually needs software that runs on pre-OS9 systems; and a sizable number of Macaholists cannot live without regularly playing older games that will definitely never run on OSX, and at present run far too slowly in Classic. Oh, and don't talk to me about the large number of new games that come out in OSX versions, and how many of the best games are available for OSX: even I, a rarely-ever-gaming user, know enough about the games available in the PC world, and about what availability OS9 offered, to get a good laugh out of that one! There are dozens of blockbuster games, major titles, that become available every month for PC gamers - and I am not talking about silly knock-offs either! Compared to that, Mac gaming under OSX is pitiful. One Quake does not a continent make....

The rumors about plans to cut the life of Classic short are worrying, because that would alienate a huge number of users, or at least force them to refrain from upgrading their OSX (and, later, their hardware). In spite of the explosion in numbers of those lesser programs that form the undergrowth necessary for a viable computer-producer ecology (lesser in terms of sales numbers, not in importance - often very specialized programs), there are still huge gaps in the range of available products. Again, many of these will not be Carbonized, and since they may be the only Mac choice in their category, exodii (isn't that the plural of exodus?) will become unavoidable. What has caused this Jeremiad?
One of the most recent AppleCare Knowledge Base articles unequivocally states that "Mac OS X 10.2 limits the amount of memory shared among all running Classic applications to less than 128 megabytes." Ignoring for the moment the problem of memory leaks, it is hard to imagine how any even marginally important piece of software (by which I mean professionally developed programs, and thus programs of a certain size) can be usefully run in Classic. And it clearly kills off the previously discussed argument about non-Carbon programs in Classic alleviating the demise of OS9 bootability. It is hard to see how this limit came to be imposed for any reason other than sales politics - it wasn't there in pre-10.2 versions, so it can't be a matter of programming. Apple seems to want to force developers into Carbonizing or Cocoaizing all their software within a very limited time horizon; the manoeuvre cannot be a stick aimed at users, since many would be left without vital software, and thus without any incentive to upgrade at all. The reason for this forced -izing must have to do with future hardware developments - perhaps an increased use of multiprocessing (MP), or at least a specialization of computing power across several chips? - something that is not supported in Classic. One suggestion has been that Apple wants to force Quark away from the Classic teat, but that would be..... the usual metaphor with sparrows, etc., isn't suitable in this context, but you get my drift....? Whatever the reason, this limit presents the Open Source world with a powerful argument against Apple - and an equally strong incentive for Macaholists to look closer at Open Source software and post-feudalism.

Melée, hotch-potch, and other UI terms (3)

Is the "page" metaphor used in Internet browsers more intuitive than the "file/folder" metaphor?
Apparently - it takes very little time for new users to understand the idea of browsing back and forward, and the necessity of knowing an address where a page can be found (though the structure of the address itself is more confusing). One explanation might be that daily work is both less and more linear than implied by the desktop metaphorics: possibilities in the latter, such as program switching and temporary storage on the desktop, are later extensions of the original idea - attempts to move away from an idea of computer use as "temp typing", instead trying to make the individual user the equivalent of a full "office worker pool". There is a lot of physicality embedded in the extended metaphorics: work activity moves from device to device (typewriter, filofax, cardfile).

In the "browser" metaphorics, it is the data rather than the programs that are the basic units, and it is the perceptual experience of the individual rather than the objective existence of the data that is promoted. The user doesn't read a document; he experiences an expression of activity and links it to other expressions, thus creating his own experience. The Back and Forward commands enable him to move linearly through his experience rather than from document level to document level, and thus he may retrace his steps and choose a different branching out to a different part of the data universe. The experientiality rather than the physicality embedded in the "browser" metaphorics is vividly portrayed in the new use of icons. Program icons are not only individualized and (often) representative of the naming rather than the functionality; they have become indicators, representing data activity progress rather than program status (activated/deactivated). Similarly, data representation has changed from being generic to sampling its experiential content - to wit, images or dtp documents.
The "browser" metaphorics implies an eradication of the boundaries of "mine and thine", extending the range of accessible data to the network - in other words, virtualizing it. In a number of ways, notably iDisk and iCal, Apple has promoted this perception, but the concept of the "digital hub" counteracts such metaphorics; here, the hierarchy is a dual-level one: iPod, mobile phones, cameras, televisions and similar data carriers are ranked below the Mac (with access going via the iApps), whereas contact with other computers has the character of peer-to-peer ranking. There is a differentiation inherent in the p2p contact, insofar as zeroConf (aka RendezVous) enables problem-free local, or communal, contact while Mail and browsing handle the "wider" connectivity, but if any hierarchisation can be ascribed here, it would be the local over the global. This might or might not be based on a conscious sociocultural perception, but it is definitely tied up with the politics of the Microsoft court case: Apple distances itself from the accusations directed at MS about monolithization (especially by incorporating programs into the OS code itself) by emphasizing (tada!!!) interfacing rather than integration. The difference between incorporating services and incorporating programs into the OS is rather vague, and it might be argued that Apple was among the first to transgress by including QuickTime in MacOS.... but it is all about perception, and Apple's emphasis on easy connectivity in peer networking firmly positions it as a "good guy", at least until MS has been judged and punished.
Including RendezVous in this discussion is permissible because the technology is precisely about rendering mechanics invisible and thus making the metaphorics of the UI more coherent - precisely the same argument that led Apple to develop and stick to removable storage devices that are integrated into the metaphorics' virtuality: the device doesn't exist virtually (is not shown on the Desktop) until it has established a data contact with the computer. Likewise, a virtual action (drag-n-dropping the icon into the wastebasket - one of the few metaphorical inconsistencies in OS9) is necessary to break the data contact with the device and enable its physical removal. For many years, the extra mechanical and electrical parts necessary for this added significantly to the cost of Apple hardware. The slot-fed CD drives signify a return to this philosophy, though sheer necessity has forced Apple to use tray-loading CD drives in the new iMacs and TiBooks so far. The AppleTalk technology was based on the same concept - not just ease-of-use but metaphorical consistency - and RendezVous re-establishes it across all connectivities, since it is a meta-technology.

Strategically, this is not only a stroke of marketing genius but a sheer necessity for the plethora of wireless connectivity to be introduced by Apple in the near future: it is necessary for iMacs to automatically find and contact not only peer devices such as iPods, cell phones and PDAs (the two latter being non-Apple products), but also mice, keyboards, loudspeakers, microphones, music keyboards, webcams and pen pads (about half of which will be delivered by Apple as either standard or optional accessories to their computers - you guess which half). The vipers' nests are about to be eradicated: "no wires", as some nosy fellow once said. Though user comfort and flexibility have something to do with this, it is mainly for aesthetic reasons that this paradigmatic hardware change will happen.
To bring the discussion about OSX UI problematics back to where it began, the next installments of musings will be about consistent paradigmatics in menus, windows and dock(s), and about the UI representation of data structuring, presentation and manipulation (you will know what that means once I get going). Inspired by David K. Every, the installments will be constructive and suggest ways to present UI metaphorics consistently and coherently. This is not in the hope that it will influence - to say nothing of inspire - Apple's GUI experts (surely they must have some, right?), but rather to enhance awareness and promote discussion of UI matters among Apple users. For this reason, the coming installments will also be given new titles.