IM Forum for 4/9/14: Experiments, Prototypes, Previews

Speakers: IMGD 2nd Year MFA Students
Time: Wednesday, April 9, 4-5:50pm
Location: USC’s School of Cinematic Arts Interactive Media Building (SCI), Room 206

Please join us for a special event this Wednesday, April 9, from 4:00 to 5:50pm in SCI 206, when the second-year MFA students will present their experiments, prototypes, and previews of next year’s IMGD Thesis projects. This will be a dynamic, hands-on opportunity to experience and critique a variety of interactive experiences at a crucial moment in their development. Don’t miss this preview of things to come!

Explore DevBlog #12: Winteract Approacheth

Accomplished (Lately)

  • Fixed movement bugs (again)
  • Implemented easy-change system for defenders to flip between defending and normal states (see the sketch after this list)
  • Implemented first pass on sound system for defenders
  • Checked off most of the “nitpick” fixes from my DevBlog #10 list for the Defender character
  • Had music meeting and got some first-pass tracks
  • Found out you can’t get access to the sound library if you’re not a student, even for student projects
  • Researched sound effects and audio in general
  • E-mailed an animator
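
Since that easy-change system comes up again below, here’s a minimal sketch of what such a state flip might look like. This assumes a Unity/C# project (the posts mention prefabs and particle emitters), and every name in it is hypothetical rather than the project’s actual code:

```csharp
using UnityEngine;

// Hypothetical sketch: one component owns the defender's mode, so the
// rest of the code only ever reads (or flips) a single flag.
public class DefenderStateSwitcher : MonoBehaviour
{
    public enum DefenderState { Normal, Defending }

    public DefenderState State { get; private set; }

    // Swapping materials doubles as the "noticeably different visual
    // state for defensive mode" cue from the DevBlog #10 list.
    public Material normalMaterial;
    public Material defendingMaterial;

    public void SetState(DefenderState newState)
    {
        State = newState;
        Renderer rend = GetComponentInChildren<Renderer>();
        if (rend != null)
        {
            rend.material = (newState == DefenderState.Defending)
                ? defendingMaterial
                : normalMaterial;
        }
    }

    public void Toggle()
    {
        SetState(State == DefenderState.Normal
            ? DefenderState.Defending
            : DefenderState.Normal);
    }
}
```

Keeping the flag and the visual cue in one place means the AI and sound code can just call Toggle() or read State.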


Next Steps

  • Complete movement code for character 3 (post-show)
  • Write AI for character 3 NPCs (post-show)
  • Finish the character 2 “nitpick” fixes
  • Make it so that fleeing characters don’t have to go all the way to their checkpoint before leaving the territory (see the sketch after this list)
  • Get final tracks from my musicians
  • Get final sounds from my sound designer
  • Get an animator on board for semester 2
  • Get ribbonfish model from 3D modeler
  • Make list of additional models for modeler to work on
  • Implement AIs dying / respawning
  • Turn off red auras on Antagonists for character 2
  • Figure out the “camera moves on collision” bug for character 2 (probably have to talk to Asher or someone about that)
  • Talk to Mike about implementing a music system similar to “Dear Moon”
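
For the fleeing-characters item above, here’s one hedged way it could work, again as an illustrative Unity/C# sketch with made-up names: rather than requiring the NPC to reach its checkpoint, check every frame whether it has already crossed the territory boundary and let it leave as soon as it has.

```csharp
using UnityEngine;

// Hypothetical sketch: a fleeing NPC still steers toward its
// checkpoint, but it counts as "gone" the moment it crosses the
// territory boundary, instead of having to touch the checkpoint.
public class FleeBehavior : MonoBehaviour
{
    public Transform checkpoint;      // original flee target
    public Vector3 territoryCenter;
    public float territoryRadius = 20f;
    public float fleeSpeed = 5f;

    void Update()
    {
        // Measure distance from the territory center on the XZ plane.
        Vector3 offset = transform.position - territoryCenter;
        offset.y = 0f;

        if (offset.magnitude > territoryRadius)
        {
            // Already outside: despawn (or hand off to a "left" state).
            Destroy(gameObject);
            return;
        }

        // Still inside: keep running toward the checkpoint.
        Vector3 dir = (checkpoint.position - transform.position).normalized;
        transform.position += dir * fleeSpeed * Time.deltaTime;
    }
}
```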


As you can see, work continues to proceed/pile up in equal measure. I started to need this list format again to keep everything straight in my mind, so expect further updates over the next week and a half. The winter show is a week from this Friday, so we’re counting down…

Explore DevBlog #10: A Cue Queue

So I’m not going to do my usual “have done” and “need to do” lists, even though the need-to-do items are all still there (plus I now also need to send some mood music to my composers). We had thesis class today, and I decided that what I really need to do before going further is all the polish work that will make the game presentable in its current state. There were some things I’d been letting slip because I considered them polish and wanted to get the core done first. But it’s becoming apparent that there’s “polish” and then there’s “we can’t tell what’s going on in your game because there are no cues.” There are, like, a million different little things that need to happen, so I’m going to list as many of them as I can think of and hope I’ll have time to get through at least most of them. (The title of this post is misleading, by the way; these are in no particular order.)


Defender Character

  • Make the “scent trails” better / turn them into particle emitters
  • Add real custom particles rather than temp particles for scent
  • Make it so you leave scent (see the sketch after this list)
  • Make faint scent trail leading up to food
  • Make it so you move at the same speed as the others of your kind
  • Give the static pathers scent like the moving ones
  • Make all static pathers descended from prefabs
  • Make a noticeably different visual state for defensive mode (AI chars)
  • Add sounds from other critters
  • Make it so you see your own antennae in creature 1 (This still needs to be better)
  • Add eating as a thing you have to do to finish that goal
  • Make it so the predators chase you more often than others
  • Fix camera clipping for character 1
  • Do a proper texture-swap for the walls (Ended up swapping out barrier completely for terrain)
  • Implement a controller
  • Implement special move feeling on controller
  • Add damage cues
  • Add dying (self and NPCs)
  • Get rid of or lower volume on sand movement sound
  • Request “bumping-into” sound?
  • Reduce speed of self and AI, and decrease area size
  • Fix AIs getting stuck
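
For the scent items at the top of this list, one simple approach is to drop short-lived particle bursts at fixed travel intervals. A rough Unity/C# sketch with hypothetical names (the real implementation may differ):

```csharp
using UnityEngine;

// Hypothetical sketch: leave a readable scent trail by spawning a
// small particle burst every few units of travel and letting each
// burst fade out after a while.
public class ScentTrail : MonoBehaviour
{
    public ParticleSystem scentPrefab; // the custom scent particles
    public float dropSpacing = 1.5f;   // world units between drops
    public float scentLifetime = 10f;  // seconds before a drop fades

    private Vector3 lastDropPosition;

    void Start()
    {
        lastDropPosition = transform.position;
    }

    void Update()
    {
        if (Vector3.Distance(transform.position, lastDropPosition) >= dropSpacing)
        {
            ParticleSystem puff = Instantiate(
                scentPrefab, transform.position, Quaternion.identity)
                as ParticleSystem;
            Destroy(puff.gameObject, scentLifetime);
            lastDropPosition = transform.position;
        }
    }
}
```

The same emitter, tinted down, could also produce the faint trail leading up to food.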


Fast Character

(Note: This seems like fewer things only because I understand this character a bit less, and he’s not as fleshed-out as character 1. I’m sure there are just as many things that need to be done; I just don’t know what they are yet.)

  • Make it so you can’t jump over walls
  • Make fleeing characters move faster
  • Make fleeing characters not move all the way to target before leaving territory
  • Add sound / music cues
  • Change “predators” to look more like territorials, remove their sensory sphere renderer
  • Make particles show up for water
  • Tweak movement to feel better
  • Add damage cues
  • Add dying (self and NPCs)
  • Make AIs leap out of water
  • Add bubble trail on PC and NPCs to help indicate sameness (see the sketch after this list)
  • Implement biting/defense of static pathers
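
For the bubble-trail item above, the point is that the PC and same-species NPCs share one emitter setup, so the matching trails read as kinship. A hedged Unity/C# sketch (the emission-module API shown here is the modern one, so treat it as illustrative):

```csharp
using UnityEngine;

// Hypothetical sketch: attach the same bubble emitter to the PC and
// to same-species NPCs; emit only while moving so idle characters
// don't froth in place.
[RequireComponent(typeof(ParticleSystem))]
public class BubbleTrail : MonoBehaviour
{
    public float minSpeed = 0.5f; // below this, stop emitting

    private ParticleSystem bubbles;
    private Vector3 lastPosition;

    void Start()
    {
        bubbles = GetComponent<ParticleSystem>();
        lastPosition = transform.position;
    }

    void Update()
    {
        float speed =
            (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;

        // Toggle emission rather than the whole object, so in-flight
        // bubbles finish their lifetimes naturally.
        var emission = bubbles.emission;
        emission.enabled = speed >= minSpeed;
    }
}
```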


General / Environment

  • Make characters sway in the waves (see the sketch after this list)
  • Replace fakey barriers with more environment-appropriate barriers that encourage crossing as char 2
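
For the sway item, a cheap trick is a phase-shifted sine offset per character so neighbors don’t bob in lockstep. Another illustrative Unity/C# sketch:

```csharp
using UnityEngine;

// Hypothetical sketch: gentle wave sway from layered sine offsets,
// with a random phase per character so the motion doesn't sync up.
public class WaveSway : MonoBehaviour
{
    public float amplitude = 0.15f; // sway distance in world units
    public float frequency = 0.5f;  // oscillations per second

    private Vector3 basePosition;
    private float phase;

    void Start()
    {
        basePosition = transform.position;
        phase = Random.Range(0f, 2f * Mathf.PI);
    }

    void Update()
    {
        float t = Time.time * frequency * 2f * Mathf.PI + phase;
        // Mostly side-to-side, with a smaller vertical bob at a
        // different rate so the motion feels watery, not mechanical.
        Vector3 sway = new Vector3(
            Mathf.Sin(t) * amplitude,
            Mathf.Sin(t * 0.7f) * amplitude * 0.3f,
            0f);
        transform.position = basePosition + sway;
    }
}
```

For characters that also swim around, the offset would go on a child visual transform rather than the root.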


So that’s it for now. I might edit this entry to add to my lists as I think of more things. (Perhaps I’ll even cross out the ones that I accomplish. That might feel nice.)

Quicksilver Developer’s Blog

Hi all,

I’ve started a dev blog for my thesis project, Quicksilver: Infinite Story, in hopes of explaining what it’s about and sharing some things that might be useful.  So far I’ve written one post explaining what Quicksilver is, and another about computational storytelling.  I’d appreciate any feedback about the direction or style of the dev blog – what would you like to see out of it?  For the time being, I’m planning on updating every other Tuesday.

Quicksilver: Infinite Story is an action RPG inspired by animated adventure television shows, fusing the genre’s tropes and conventions into an unprecedented story generation engine to create unique new adventures every time you play.  This past year, some classmates and I developed a prototype as part of the Advanced Game Project collaborations between the IMD and USC’s Gamepipe lab.  For my thesis, I’m trying to reinvent the gameplay to mesh with the story generation system and take advantage of all the opportunities afforded by procedural narrative.   Below, I’ve cross-posted the latest blog entry, about how to get a computer to write a story.

(more…)
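
For a taste of the simplest end of computational storytelling, here’s a toy grammar-based generator. To be clear, this is a generic illustration with made-up rules, not Quicksilver’s actual engine:

```csharp
using System;
using System.Collections.Generic;

// Toy story generator: expand symbols by randomly choosing among
// rewrite rules. Real systems add state, causality, and planning;
// this only shows the bare "grammar" idea.
class StoryGrammar
{
    static readonly Dictionary<string, string[]> Rules =
        new Dictionary<string, string[]>
        {
            { "STORY",   new[] { "HERO discovers that VILLAIN PLOT, so HERO GOAL." } },
            { "HERO",    new[] { "the rookie pilot", "the exiled princess" } },
            { "VILLAIN", new[] { "the shadow syndicate", "a rogue general" } },
            { "PLOT",    new[] { "plans to poison the city's water supply",
                                 "has stolen the royal airship" } },
            { "GOAL",    new[] { "races to warn the council",
                                 "sneaks into the enemy fortress" } },
        };

    static readonly Random Rng = new Random();

    static string Expand(string text)
    {
        bool changed = true;
        while (changed)
        {
            changed = false;
            foreach (var rule in Rules)
            {
                if (!text.Contains(rule.Key)) continue;
                // Bind each symbol once so, e.g., HERO stays the same
                // character everywhere it appears.
                string choice = rule.Value[Rng.Next(rule.Value.Length)];
                text = text.Replace(rule.Key, choice);
                changed = true;
            }
        }
        return text;
    }

    static void Main()
    {
        // e.g. "the exiled princess discovers that a rogue general has
        // stolen the royal airship, so the exiled princess races to
        // warn the council."
        Console.WriteLine(Expand("STORY"));
    }
}
```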

SurroundScapeMixer visuals…

So, it’s been a fun ride with thesis thus far. And right now, I’m REALLY angry at myself. 🙂

(Don’t let the smiley face fool you!)

I’ve succumbed to the physical device metaphor! ARGH!

After much user testing, the thing has a more interesting interface that improves upon the draft interface in a number of ways…

1) Users drag an icon onto a channel to represent it.

2) Channels with the same icon are moved simultaneously (see the sketch below).

3) The Tablet now shows which channels are recording automation (rather than having to look at the sequencer).

4) Much faster screen refresh now…
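
Here’s roughly how point 2 could work under the hood, as a hedged C# sketch with invented names: map each channel to its icon, and when one channel moves, look up everything else wearing the same icon.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of "channels with the same icon move together":
// assigning an icon to a channel puts it in an implicit group.
class ChannelGroups
{
    // channel number -> icon name ("drums", "vocals", ...)
    private readonly Dictionary<int, string> iconOf =
        new Dictionary<int, string>();

    public void AssignIcon(int channel, string icon)
    {
        iconOf[channel] = icon;
    }

    // Everything that should move when `channel` moves.
    public List<int> LinkedChannels(int channel)
    {
        var linked = new List<int> { channel };
        string icon;
        if (!iconOf.TryGetValue(channel, out icon))
            return linked; // no icon: the channel moves alone

        foreach (var pair in iconOf)
        {
            if (pair.Value == icon && pair.Key != channel)
                linked.Add(pair.Key);
        }
        return linked;
    }
}
```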

I really like it! I’m just annoyed that it came to this! And not just any physical metaphor! Frigging METALLIC BLUE.

Surround mixing hip hop music…

…it kind of worked!

So last Friday, I had some fun with some of my cousin’s material, remixing it into surround. Nothing big…mostly the draft stuff he had on his Pro Tools rig…

Lesson 1) Pro Tools…ARGH!

So, while both of us can use Pro Tools, neither of us is really an expert, and it took us a while to figure out how to get the surface-controller mapping working… and once we did, we didn’t quite get it working the way we wanted.

We ended up taking the easy way out… exporting OMF files so that I could use them in Sonar. Frustrating.

Lesson 2) More LOOK, Less numbers.

So right now, the tablet looks something like this:

(Image: surface.gif, the current tablet interface)

Which, I’ve now realized, while extremely functional, doesn’t give a lot of information. Or rather, it gives a lot of bleh information. We could move things around really well, and we rather quickly came up with a sound placement he was very happy with, but visually he was hoping for something more “shape-y.”

And I think I see what he wants. Gonna spend the next few days playing around with some visuals. I know what I want to get on the screen, but it’s sort of hard to describe. The image I have in my mind is a sphere in Maya in vertex-edit mode, where you’re poking into the sphere, or pinching and pulling it outward, and stuff like that.

So yeah. More solid shape, less points.

And screw the angle/focus/dB numbers too!

Persuasive Games: Games Phone Home – Darfur is Dying

Ian Bogost’s write-up on “Darfur is Dying” for Serious Games Source draws out some key similarities and differences between it and The Legend of Zelda: Wind Waker, Ico, E.T. the Extra-Terrestrial, and his very own Disaffected!


Some excerpts:

One of the unique properties of video games is their ability to put us in someone else’s shoes. But most of the time, those shoes are bigger than our own. When we play video games, we are like children clopping around in their parents’ loafers or pumps, imagining what it would be like to see over the kitchen counter. As I argued in my last column, this trend corresponds with video games’ tendency to fulfill power fantasies. Video games let us wield deadly weapons. They let us wage intergalactic war. They let us take a shot on goal in the World Cup final. They let us build cities, and then they let us destroy them.

Darfur is Dying, created by USC graduate Susana Ruiz as part of her MFA thesis, is a game that breaks this tradition. In one part of the game, the player takes the role of a Darfuri child who ventures out of the village to a well to retrieve water for his family.

In Darfur, weakness is all the player ever gets. There is no magic to invoke, no heroic lineage to appeal to; strength adequate to survive is simply inaccessible.

I have numerous objections to the way Darfur is Dying represents the current political situation in the Sudan, most of which relate to how the game (and really, most American media rhetoric about the region) ignores the historical and political context for the current violence. But the game’s water foraging dynamic offers an important lesson for designers of serious games. If such games are meant, at least in part, to foster empathy for terrible real-world situations in which the players fortunate enough to play video games might intervene, then those games would do well to invite us to step into the smaller, more uncomfortable shoes of the downtrodden rather than the larger, more well-heeled shoes of the powerful.

Perhaps in 1982 the world was not ready for a video game about the loneliness and frailty of an extraterrestrial. But, oddly, we were ready for a film about it. E.T.’s role in the video game crash of ’83 may or may not be overemphasized, but certainly we have used its failure as part of an ongoing excuse to represent only power, and never weakness in video games. Critics might argue that frail situations are not fun. They might argue that feeble characters do not wear shoes anyone wants to wear. And that may be true. But when it comes to the world we inhabit today, isn’t it the vulnerable—like E.T., or more strongly, like the Darfuri—who demand our empathy?

Again, the complete article is here.

An addendum…

So a few people have reacted quite negatively to the quote from my previous post. I guess my choice of words wasn’t that great…

I think the thing to note is this: no matter how awesome an engineer is at the mixing board, or how quickly a person can fly around the console view of a PC sequencer, he mixes by controlling multiple one-dimensional values. With the hands he’s given, he can only shift one of these values at a time. If he’s GOOD, he can shift two of those values at a time.

Spatiality, however, cannot be defined by a single one-dimensional pot or fader.

Sure, the old paradigm of the stereo line worked fine with the pot and fader combo: a quick twist to put it somewhere between full left and full right, a quick push to place the volume.

But in the surround-sound world, that just doesn’t fly. You also have to manage things like how focused or wide the sound is, how it’s distributed across the rear channels, and the ratio of the front-to-rear mix. And there are other things, like reverb and EQ, that also affect the perceived directionality of the sound.
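
To make that concrete, here’s a hedged C# sketch of plain pairwise constant-power panning over a five-speaker ring. It ignores focus, width, and front/rear balance entirely, and one “position” still fans out into five interdependent gains, which is exactly what a single pot can’t express:

```csharp
using System;

// Sketch: pairwise constant-power panning across a 5-speaker ring
// (angles follow the usual 5.1 layout, sorted: Ls, L, C, R, Rs).
// One source angle fans out into five gains.
class SurroundPan
{
    static readonly double[] SpeakerAngles = { -110, -30, 0, 30, 110 };

    public static double[] Gains(double sourceAngleDeg)
    {
        // Clamp to the speaker arc for simplicity; a real panner
        // would also wrap through the rear between Ls and Rs.
        double angle = Math.Max(-110, Math.Min(110, sourceAngleDeg));
        var gains = new double[SpeakerAngles.Length];

        for (int i = 0; i < SpeakerAngles.Length - 1; i++)
        {
            double a = SpeakerAngles[i], b = SpeakerAngles[i + 1];
            if (angle >= a && angle <= b)
            {
                // Constant-power crossfade between the adjacent pair.
                double t = (angle - a) / (b - a);
                gains[i] = Math.Cos(t * Math.PI / 2.0);
                gains[i + 1] = Math.Sin(t * Math.PI / 2.0);
                break;
            }
        }
        return gains;
    }

    static void Main()
    {
        double[] g = Gains(15); // a bit right of center
        Console.WriteLine(string.Join(", ",
            Array.ConvertAll(g, x => x.ToString("0.00"))));
        // -> 0.00, 0.00, 0.71, 0.71, 0.00 (power split between C and R)
    }
}
```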

(Image: spanner.gif)

I did recently buy a high-end audio sequencer; I upgraded my old Sonar 3 to the hip and cool Sonar 5 Producer Edition. Among its features is an “intuitive surround panner” that “appears on any surround channel or bus, with no need for patching.” It’s not bad, and it allows for automation too… but for a top-of-the-line professional tool, it feels remarkably amateurish. How about being able to visualize sounds on top of each other? If you want to see sounds relative to others in surround space, your only option is to look at the respective surround-panner circles side by side. Geez!

There must be a better way… a way that is not only faster and more intuitive, but also perfectly complementary to existing workflows as well as inspiring to manipulate and behold.

I think I’ve got an idea.