

Saturday, February 07, 2004

The sum of all salsa

"What are you drinking?"... a glance at someone else's glass... "Water?"
I hold up the little dish. "Salsa!"

After the next toast someone commented that I had "man-juice" on my shirt from slorped salsa.

You gotta get your veggies in somehow, right?

Someone recently said, "The thing about Dan is, when you think he's kidding he's being serious, and when you think he's serious, he's kidding!" That might sum me up in one sentence right there, I suppose.

2/07/2004 10:52:29 PM


Rethinking the 6-sided box

The driving problem
The real problem with the 3D modeling applications of the time was that none of them had given much thought to the artists using them. Using a 3D modeling package to do anything more than create a picture of a shiny ball required a high degree of specialized training in that particular application. Creating complex shapes required a mind that could think several steps ahead - and still often ended in failure. Artists rarely think that far ahead when creating. I saw a need for an environment that fostered more direct manipulation of geometry.
A good example here would be adding a texture to an object. A texture is an image that's applied to the geometry of an object - like a decal. Then-current 3D applications required you to switch to Adobe Photoshop, create an image of your texture there, and come back to the 3D environment. There you would attempt to fit your image to the shape of the object - no small task. Often there was no way in hell that the texture would line up correctly with the object.
There was no reason you couldn't paint directly on your object, other than that the existing paradigm had not been broken.
The goal of Event Horizon was to break that barrier and provide tools that artists could use.

A Starting Point
To create an application that allowed direct manipulation of 3D geometry was arguably beyond the technology of 1996. At the time, hardware accelerators were just beginning to come to desktop computers. ATI was just starting to ship the Rage/Rage II chipsets, and there wasn't much software out there to support hardware acceleration. OpenGL existed, of course, but was largely unknown in the desktop world - only high-end workstations like SGI hardware used OpenGL.
3D applications of the day were very limited. Infini-D, RayDream, Strata, and others were, on their surface, all the same. They supported a limited number of primitives - box, sphere, cone, cylinder - and a few spline-based tools like a lathe tool. All of those applications used software rendering, not a hardware accelerator card, and all of them relied on a 3+1 view of the modeling scene to present a 3D world to the viewer. The work area was divided into 4 panes, each viewing the 3D scene from a different perspective. One was a top view, another a side view, another front, and the last usually a "camera" view or orthogonal projection. It was left to the user's brain to synthesize 3 dimensions from 4 different views. This led to a process of construction that was tedious and frustrating. Often this was done in wireframe, with no depth information. The user could never be certain whether the surface they were editing was on the front or the back of the object, leading to many lost hours and lost hair.

Soft Shadows: The allure of radiosity
One rendering technique that was in the spotlight was radiosity. Conventional ray tracing did not handle soft shadows and shading at all, which is why many computer-rendered images look very artificial. Radiosity could be used to complement ray tracing by computing the soft shadows and altering the actual geometry of the scene: in places of soft shadow, the geometry was broken up into ever-smaller triangles, each with a different shading value, which when rendered would give you nice soft shadows.

Of course, with each surface of the scene's geometry broken up into many more triangles, you were giving the rendering pipeline much more work to do - more triangles to process. Some games got around this problem by computing the radiosity solution, pre-rendering the scene and using that as a texture on the original, simple geometry.
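The gathering process behind radiosity can be sketched in a few lines of toy code. This is not Event Horizon's renderer - just the textbook iteration with invented patch numbers: each patch's brightness is its own emission plus whatever it reflects of every other patch's brightness, repeated until the solution settles.

```python
# Toy radiosity solver: iterate B_i = E_i + rho_i * sum_j(F_ij * B_j)
# until the patch brightnesses converge. Patches, reflectances, and
# form factors below are invented for illustration.

def solve_radiosity(emission, reflectance, form_factors, iterations=50):
    """Jacobi-style iteration of the radiosity equation."""
    n = len(emission)
    radiosity = list(emission)
    for _ in range(iterations):
        radiosity = [
            emission[i] + reflectance[i]
            * sum(form_factors[i][j] * radiosity[j] for j in range(n))
            for i in range(n)
        ]
    return radiosity

# One light patch and two reflective wall patches (made-up numbers).
E = [1.0, 0.0, 0.0]          # only patch 0 emits light
rho = [0.0, 0.5, 0.5]        # the walls reflect half of what they receive
F = [[0.0, 0.5, 0.5],        # form factors: how much each patch "sees"
     [0.5, 0.0, 0.5],        # of every other patch
     [0.5, 0.5, 0.0]]

B = solve_radiosity(E, rho, F)
# The walls end up lit indirectly even though they emit nothing.
```

The soft gradations come from evaluating this per-patch: where the solution varies quickly, the mesh is subdivided into smaller patches, which is exactly the triangle explosion described above.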

The rendering pipeline that Event Horizon used supported very, very high numbers of triangles for its time - in fact, more than I knew what to do with. So not only could it support the high triangle count of a radiosity-cooked scene, but in many cases it could do so in real time. At the time, radiosity and other soft shadow systems broke a barrier in the suspension of disbelief for users - with both hard (ray tracing) and soft (radiosity) shadows covered, it was far easier for users to believe that a scene was a photograph, for instance, than an artificial environment.

In the end, after a number of experiments variously named "Sunburn" and "Shake n Bake", radiosity was only one technique that was used to solve soft shadows. Some of the experiments made for dramatic demos though - things like moving light sources producing soft shadows impressed programmers. But it wasn't convincing enough for the uninitiated. It still came out looking very sterile and artificial, like a photo out of a magazine, a bad movie, or Martha Stewart. In many ways it was too perfect.

Spatial Audio: SoundSprocket
Using the Apple GameSprockets API was one of the best decisions I had made. Besides the InputSprocket layer which gave me access to consumer game input devices, the real gem of the GameSprockets was SoundSprocket. SoundSprocket was a 3D audio library that was very well thought out and implemented - the data structures for working with SoundSprocket were very similar to those used for QuickDraw3D, which made the two very easy to use together. OpenAL and OpenGL, on the other hand, are often speaking in two very different dialects of the same tongue, which makes them difficult to use together.
You would think that the designers of those two APIs would assume developers would want to use OpenAL to attach sound to OpenGL scenes, but I guess not.

At any rate, SoundSprocket let me attach sound to 3D objects, and served as a 3D audio renderer. You could pick a point in 3D space and put your sound there, or link the position of a QuickDraw3D object with a sound source. Very cool, and very useful. Spatial audio was a huge advantage for users. If you've played a game with 3D audio you can understand - in a game you can hear things sneaking up behind you, rockets whizzing by. In Event Horizon, your objects and actions had sounds attached to them that travelled in 3D space.
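The geometry behind that kind of spatial audio is simple enough to sketch. This is not SoundSprocket's API - just a toy function with invented conventions (y is up, the listener faces along the XZ plane) that turns a source position into a gain and a left/right pan:

```python
import math

def spatialize(listener, facing, source):
    """Return (gain, pan) for a sound at `source` heard from `listener`.
    `listener` and `source` are (x, y, z); `facing` is the listener's
    normalized forward direction as (x, z). gain falls off with distance;
    pan runs from -1.0 (hard left) to +1.0 (hard right)."""
    dx = source[0] - listener[0]
    dz = source[2] - listener[2]
    dist = math.sqrt(dx * dx + dz * dz) or 1e-6  # avoid dividing by zero
    gain = 1.0 / (1.0 + dist)                    # simple inverse falloff
    # The listener's right vector, from forward x up in a y-up system.
    right = (-facing[1], facing[0])
    pan = (dx * right[0] + dz * right[1]) / dist
    return gain, pan

# Listener at the origin facing down -z; a sound directly to the right:
gain, pan = spatialize((0, 0, 0), (0, -1), (1, 0, 0))
```

A real 3D audio renderer does far more (Doppler, occlusion, reverb), but gain and pan are the part that lets you hear something sneaking up behind you.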

QuickTime provided a built in synthesizer that was adequate for generating some kinds of sounds and tones on the fly (it was better with musical instruments than with raw FM tones). Generating a sound was not a big deal, even though it could be tricky to generate a desired sound - it would often come out sounding like static instead of what you wanted. But that meant that the program could create sounds from nothing - or from the parameters of the environment itself.
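For a sense of what generating a tone from parameters means, here's a toy FM synthesis sketch - nothing to do with QuickTime's actual synthesizer, just a carrier sine wave whose phase is wobbled by a modulator. This is exactly the kind of thing that comes out sounding like static when the parameters are wrong:

```python
import math

def render_tone(freq_hz, mod_hz, mod_depth, seconds=0.1, rate=22050):
    """Render a simple FM tone: a sine carrier at freq_hz whose phase is
    modulated by a sine at mod_hz. Returns raw samples in [-1.0, 1.0]."""
    samples = []
    for n in range(int(seconds * rate)):
        t = n / rate
        phase = (2 * math.pi * freq_hz * t
                 + mod_depth * math.sin(2 * math.pi * mod_hz * t))
        samples.append(math.sin(phase))
    return samples

# A 440 Hz carrier with a mild 110 Hz wobble - musical-ish.
# Crank mod_depth way up and the same code produces noise-like static.
tone = render_tone(440.0, 110.0, 2.0)
```

The interesting part is that the three parameters could come from anywhere - including, as described above, from the environment itself.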

Audio Shaders: Sound as Texture
I had read an interesting paper on "Sound As Texture" a few months before this point, and hadn't thought much of it at the time. I went searching for the article again and didn't find it until long after I had implemented audio shaders, but what I ended up with differed significantly from what the authors had described anyway, so I guess this was a good thing.
Objects in 3D environments have properties attached to them that define their appearance, called shaders. The hair on Sully in Monsters, Inc. was a complex shader; the pattern on Nemo's scales in Finding Nemo was a simpler one. Shaders can be very simple, like setting a color and a few things like how reflective or transparent your object is, or they can be complex and define how the surface responds to the environment and the object it is attached to (very convincing clouds are often well-written complex shaders, for example). So a shader can be as simple as a color, or as involved as being a programming language in itself. The most famous language for writing shaders came from Pixar in the late 1980s and is what they and the rest of the world still use today: RenderMan.
Whole programs can be written in a shader language. So the idea here was to have shaders that could produce audio - primarily so that when two objects interacted in a scene, their shaders would create the appropriate sounds. A large, hollow object would make a different sound than a small, solid one when touched or struck - and the sounds would be governed by the shaders and objects interacting with each other.
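The core of an audio shader is a mapping from an object's physical properties to synthesis parameters. Here's a sketch of that idea with completely invented formulas - this illustrates the concept, not Event Horizon's actual shader code:

```python
def impact_sound_params(volume, hollowness, impact_force):
    """Map the physical properties of a struck object to synthesis
    parameters: bigger objects ring lower, hollow objects ring longer,
    harder impacts are louder. All formulas here are made up for
    illustration.
    volume      - object size in arbitrary units (> 0)
    hollowness  - 0.0 (solid) .. 1.0 (hollow shell)
    impact_force- how hard the objects collided, arbitrary units"""
    base_pitch = 2000.0 / (1.0 + volume)   # Hz: pitch falls with size
    decay = 0.1 + 0.9 * hollowness         # seconds of ring
    loudness = min(1.0, impact_force / 10.0)
    return base_pitch, decay, loudness

# A big hollow drum versus a small solid pebble, struck equally hard:
drum = impact_sound_params(volume=50.0, hollowness=0.9, impact_force=5.0)
pebble = impact_sound_params(volume=0.2, hollowness=0.0, impact_force=5.0)
```

When two objects interact, each one's parameters feed the synthesizer, so the collision sound falls out of the objects' properties rather than a canned sample.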
In practice, more often than not you got rasping noises. I couldn't pin down whether it was a synthesizer problem (I wasn't as good with sound programming at that level as I thought I was) or a shader problem (the system may not have thought that objects were hitting each other as hard as it should have). When it worked though, it worked well. Most of the time, however, you'd want to use a prerecorded sound rather than a programmed shader, since the results with shaders were unpredictable. There was code to support mixing the two - like putting distortion or a "whammy bar" on a recorded sound in memory inside a shader. That was designed to support some user interaction features that never got implemented. Unfortunately, when a user shaped geometry with their "hands", the system couldn't quite make squeaks like making a balloon animal.

2/07/2004 01:03:00 AM


Wednesday, February 04, 2004

Back to the Back to the Back to the BeBox

Finally I got my dual processor BeOS machine at home up and working. Turns out that the "dead PRAM battery" in it was actually no PRAM battery at all; once that was fixed, things improved considerably.

I managed to port some things to PowerPC that previously were Intel only, nothing particularly interesting but useful nonetheless. I may end up putting them up on [BeBits].

To give you some history, in 1996 Be allied itself with the Mac clone makers to offer an alternative to the MacOS on Mac-compatible hardware. That built some momentum, and while Mac users were very taken with Be, it was mostly to position BeOS to be acquired or licensed by Apple as the foundation of their new OS strategy. Apple ended up buying NeXT instead, and Be realigned itself around Intel hardware - which grew Be's userbase even more. Things went downhill for Be from there; not long before they were acquired by Palm, they realigned again around a "media device" strategy - set-top boxes, internet access terminals, and the like, the same space that QNX had carved out for itself.

Thus, most of the interesting things written for BeOS happened while Intel hardware was in vogue, and the PowerPC users were largely neglected. While I have an Intel machine that runs BeOS, it's a rather weak PI 166, which even with BeOS is a bit sluggish (hey, the peecee was bought to be a router anyway). My Dual 200 PowerMac makes an excellent BeOS machine, but I end up either having to port things if I can, or I just have to do without some things. It doesn't help much that the PowerPC development tools are based around the somewhat eclectic [Metrowerks] command-line tools, while the Intel tools are based around the more standard [GCC]. What builds under GCC can take quite a bit of work to get going under the Metrowerks mwcc compiler.

2/04/2004 06:33:55 PM


Tuesday, February 03, 2004


Dear Friends
Don't let me make excuses for the past year. I let us grow distant and out of touch. Your world moved and I stood still. I can't make up for the lost time, or the lost times, I can't make enough amends for not being there... but I can try. You know where I was, and what I was doing. I know now how much I neglected the people that really mattered.
I'm sorry.

Dear LittleBear
Still my best friend. I should have been there when you were going through NA, I should have been there for everything. I should be there more for you now, but I get the feeling you're keeping me at arm's length.
I understand. I've been there.

Dear Jill
Through it all, you were my reality check, though I didn't always appreciate it, and I didn't listen as well as I should have. You're one of the few true friends this geek has, and you were there for me when I needed you. I just wish I knew how to pay you back, how to make you realize how important that was to me.
And no Marines. Ever.

Dear R.T.
Yes, you. I screwed up. If we had taken the time to talk more, you'd have a better understanding of why, but it doesn't really matter. I made a mistake and could have handled the situation better. I don't even know if you remember what the situation was. Doesn't matter. I wish things had worked out differently, since you're a good person, and a good person for me to be around... but there's nothing I can do about that now.
And yes, I sorted out the magazines.

Dear GKT
Yes, I can still call you GKT. The whole "A" thing just doesn't sit well with me.
I'm not being too hard on myself. Honest. And I've always been a little too hard on you, I held you to my own standards, which was unfair.
I'm glad we still talk. There's a lot left to say.

Dear Self
You've been an asshole, especially so recently. You were an ass to yourself for most, if not all, of 2003. Since then you've screwed up even more. You pushed away people who weren't about to cut your heart out.
I know you've been burned, and burned badly... don't spend the rest of your life afraid of fire.
Stop it.

2/03/2004 12:25:31 AM


Monday, February 02, 2004

Isn't that thoughtful

The Java security API manages certificates and signing much like the MacOS keychain does, and thankfully the two interoperate since the [MacOS X JDK 1.4.1 update]:
Native support for the security features is provided through the Mac OS X Secure Transport API. This allows users to manage certificates through their Keychain and the Keychain Access application.

Which is nice, since I was just about to test and implement it myself! This means that with JavaMail and the [bouncycastle] S/MIME provider, signatures and crypto should be about as transparent to the user as they are with

2/02/2004 04:57:50 PM



As [crandall] has pointed out, Impossible Recording Machine is the first Positron artist on the iTunes Music Store - meaning you can redeem those Pepsi points for some good music. One of my gripes with the Music Store when it launched was that a lot of the artists I listen to are on very small labels, like BileStyle (now part of Underground, Inc) and Positron. Apple so far has surprised me with how far out of their way they have gone to include small labels in the ITMS; within a few months I could easily see myself buying most of my music there as more small labels are added.
Anyway, Chris didn't include a direct link to Impossible Recording Machine's listing in iTunes, so here it is:


2/02/2004 03:46:18 PM


Sunday, February 01, 2004

Standing on your head before bed

"What are you doing Dan?"
My roommate, Bill, was picking cut blades of grass out of his hackey sack before he brushed his teeth for bed. He looked me over nonchalantly at first, then with increasing curiosity.
"Trying to fall asleep standing on my head," which was easier said than done. Once I was more or less on my head, it got harder and harder to relax by the second.
"Why are you doing that though?"
"I'm going to try something new every day. I don't think this is going to work, but at least I tried, right?"

When people ask me "Why are you learning to fire dance?" my response is often "If you could, wouldn't you?". It's surprising how many people have told me they wouldn't.

People continue to astound me.

2/01/2004 04:26:03 PM