Backchannel for IMD Seminar 11/11/09

11/11/2009 18:24:39 ‹Jade› observe the homo sapiens rex as it devours its prey. The chimp raptors will have to wait their turn.

11/11/2009 18:25:26 ‹ndef› What is the motivation for designing systems as non-humans or for non-humans?

11/11/2009 18:26:14 ‹caravaggio› i don’t think it’s possible to design from a non-human POV

11/11/2009 18:26:24 ‹caravaggio› unless you ask a non human to make choices

11/11/2009 18:28:10 ‹caravaggio› don’t all turing complete languages have the same capabilities?

11/11/2009 18:29:14 ‹ndef› Yeah, but they lend themselves to different things. I think it’s less a matter of possibilities and more a matter of how you tend to work in concert with the tool.

11/11/2009 18:29:46 ‹caravaggio› i hear that lisp is designed for non humans

11/11/2009 18:30:07 ‹KylaG› Lisp was designed for clams and clam-shaped beings.

11/11/2009 18:33:19 ‹MAnnetta› Much like Game Hats!

11/11/2009 18:38:28 ‹ndef› Were those theatrical spaces designed for humans or not?

11/11/2009 18:38:38 ‹ndef› I don’t think I have a sense of what that term means.

11/11/2009 18:38:52 ‹caravaggio› i think the idea was to make humans feel like something other than human

11/11/2009 18:40:27 ‹Seminar› _speakers.php

11/11/2009 18:44:06 ‹KylaG› I feel like these projects would be good to look at when considering the relationship between immersion and fun.

11/11/2009 18:44:47 ‹ndef› What do you think we gain when we strip away models of user intention and user state?

11/11/2009 18:46:13 ‹caravaggio› can you do that? users will always intend something, the system may be able to change the possibility space of those intentions

11/11/2009 18:46:59 ‹KylaG› Is exploration an intention?

11/11/2009 18:47:19 ‹Mike›,en/

11/11/2009 18:47:34 ‹caravaggio› any conscious motivation for action is intention isn’t it?

11/11/2009 18:47:54 ‹ndef› I think he’s talking about “non-syntactic” systems that map input to output in a continuous way, rather than privileging some behaviors.

11/11/2009 18:48:05 ‹ndef› Is that what he’s talking about?

11/11/2009 18:48:35 ‹caravaggio› if it is possible for the user to understand the relationship between their behavior and the system’s response they can express intention

11/11/2009 18:48:38 ‹KylaG› I feel like “intention” gives the impression that you’re trying to do something in particular, whereas exploration is just trying things to see what the response is.

11/11/2009 18:48:51 ‹KylaG› But that’s a fairly subjective definition, I suppose.

11/11/2009 18:49:54 ‹KylaG› So what he’s talking about, then, is the transition in the human mind between exploration and intention.

11/11/2009 18:49:59 ‹KylaG› ?

11/11/2009 18:50:17 ‹caravaggio› isn’t exploration predicated on the desire to know what something is like?

11/11/2009 18:51:04 ‹ndef› Upon encountering an unfamiliar system, the natural response (human response?) is to explore, in order to build a cognitive model of the system. Yes?

11/11/2009 18:51:30 ‹caravaggio› yes, but all of our behavior is ‘natural’

11/11/2009 18:52:11 ‹ndef› Yeah. I’m trying to relate this back to his idea of non-humanity, which I still don’t totally understand.

11/11/2009 18:52:16 ‹MAnnetta› But isn’t the human response to build that cognitive model off of older, established models?

11/11/2009 18:52:41 ‹ndef› Yeah, probably.

11/11/2009 18:53:06 ‹caravaggio› i guess the question is can we be something other than what we are?

11/11/2009 18:53:16 ‹caravaggio› and then are we ourselves?

11/11/2009 18:53:41 ‹MAnnetta› It goes back to the definition of “what we are” so we can see “what we are not”

11/11/2009 18:54:01 ‹caravaggio› right

11/11/2009 18:54:14 ‹caravaggio› it goes without saying that we can’t be anything that is outside of what human is

11/11/2009 18:54:18 ‹caravaggio› because we are human

11/11/2009 18:54:34 ‹caravaggio› unless we cease to be human in a substantial way

11/11/2009 18:54:46 ‹KylaG› I was thinking maybe it was the difference between working with something humans understand cognitively, versus working with things we feel intuitively?

11/11/2009 18:54:47 ‹caravaggio› at which point we would be something that used to be human that may remember that POV

11/11/2009 18:55:08 ‹KylaG› “Human” vs. “The World” in the sense of “cognitive” vs. “emotive”?

11/11/2009 18:55:32 ‹KylaG› But I could also be completely wrong.

11/11/2009 18:55:39 ‹caravaggio› that sounds like different aspects of human existence

11/11/2009 18:55:40 ‹ndef› Based on the theatrical stuff that we’re seeing, however, this work seems to be all about building cognitive models.

11/11/2009 18:56:21 ‹caravaggio› if we can build a model of any sort of process or system in our head, its functioning is a part of our cognition

11/11/2009 18:56:53 ‹caravaggio› i think what he’s talking about necessarily involves building systems that are beyond our understanding

11/11/2009 18:57:23 ‹ndef› What does that get us, as designers?

11/11/2009 19:01:04 ‹KylaG› Well, it gives us an interesting understanding of human cognition.

11/11/2009 19:01:20 ‹KylaG› That lets us understand how people will experience and learn about our games and systems.

11/11/2009 19:01:23 ‹ndef› 1. Live, face to face experience; 2. Eliminate language; 3. Blur the line between actor and spectator.

11/11/2009 19:02:21 ‹ndef› What exactly is the difference between rich and complicated?

11/11/2009 19:02:49 ‹caravaggio› rich indicates quality and complicated indicates complexity

11/11/2009 19:03:04 ‹ndef› Quality according to what measure?

11/11/2009 19:03:17 ‹KylaG› So it sounds like the “non-human” aspect here is the reactive environment.

11/11/2009 19:03:19 ‹caravaggio› that is not indicated

11/11/2009 19:03:41 ‹caravaggio› people still designed the reactive environment

11/11/2009 19:04:04 ‹caravaggio› doesn’t that make it human?

11/11/2009 19:04:37 ‹ndef› And design it specifically so that it will be properly reactive to humans, even if it isn’t deliberately model-based.

11/11/2009 19:05:23 ‹KylaG› No, you just design it so it will be reactive in general.

11/11/2009 19:05:33 ‹KylaG› It would be just as reactive if a bird flew through it, say.

11/11/2009 19:05:45 ‹ndef› I don’t think it would.

11/11/2009 19:05:59 ‹caravaggio› all of these are designed for humans to use them though

11/11/2009 19:06:02 ‹KylaG› Well, certain aspects. I’m thinking in terms of his “water” example.

11/11/2009 19:06:13 ‹KylaG› Not necessarily the stuff of his that he’s shown.

11/11/2009 19:06:26 ‹KylaG› It may be one of those theoretical asymptotes that we can’t actually reach.

11/11/2009 19:06:32 ‹ndef› Hm.

11/11/2009 19:28:41 ‹ndef› “What is geometric performance?”

11/11/2009 19:37:10 ‹ndef› (That question was based on bad parsing… “the performance of computational geometry” was intended.)

11/11/2009 19:41:44 ‹ndef› Shallow semantics.

11/11/2009 19:41:54 ‹Bill› Can anyone whip up a retrospective reading list for this seminar?

11/11/2009 19:42:51 ‹Bill› I feel like my brain is leaking tasty concepts.

11/11/2009 19:45:00 ‹Bill› “User” as an abstraction.

11/11/2009 19:47:09 ‹Bill› Quasi-physics and playing against expectations.

11/11/2009 20:02:01 ‹Bill› It’s not like training dolphins.

11/11/2009 20:03:24 ‹KylaG› If it’s rewarding play, then it’s like training dolphins.
