[personal profile] walbourn


Roundtable: "By the Books: Solid Software Engineering for Games" (part 2 of 3)
The focus of this roundtable was on software engineering methodologies, and included discussion of Scrum, test-driven programming, and extreme programming. Books exist on all these topics, and a number of websites are devoted to various methodologies for further reading.

Scrum is an 'agile' development methodology and was recently adopted by a company represented at the roundtable. They felt it works well with small groups of 7-10 programmers, but smaller or larger teams would probably have difficulty implementing it successfully. They discussed the general process and felt it was going to work out well for them in the long term, but they had only a month or so of experience with the approach.

Test-driven programming involves writing unit tests (even before writing the code being tested), and works best in conjunction with an automatic build process that invokes all the unit tests after each build. This approach was being used to varying degrees by a number of attendees at the roundtable, although building a good suite of unit tests can be time-consuming. Most felt that even a small suite of unit tests was helpful for systems code, but that smoke tests for game logic or complicated/dynamic systems required a larger investment of resources. Tying the tests to automatic builds ensured that problems and invalid tests were discovered quickly, before other code was developed that relied on the potentially faulty functionality. Writing the unit tests also provides an excellent review of the interfaces to a system and a basis for refinement. Often, however, the unit test code approached the size of the system it was designed to test.

Extreme programming is a suite of methodologies that incorporates pair programming, rapid iterative development, and test-driven techniques. While a number of people were using aspects of extreme programming, only one company represented at the roundtable was using the full technique. Pair programming was generally seen as very effective for debugging complex issues or in 'crunch' periods, but expensive and potentially wasteful for full-time use. Having more than one developer working on the same section of the code base, however, was viewed as good for stability, and it avoided investing all knowledge of a specific piece of code in a single developer. Pair programming requires developers to be in close physical proximity, both for the work itself and for the regular (though short) meetings needed for rapid iterative development.


Workshop: "Microsoft's HLSL Introductory Workshop"
Microsoft offered workshops during the conference with some hands-on use of the High-Level Shader Language version 3.0 in the DirectX 9 Summer 2004 Update beta, within the .fx effects-file framework. The workshop also demonstrated the use of the shader debugging tools integrated into Visual Studio .NET 2003.


Keynote: "Bleeding Edge Engine Development"
John Carmack gave his first GDC keynote this year, so it was well attended. He is clearly not a practiced public speaker and rambled a bit, but he covered a lot of general technology and development trends, as well as hard problems yet to be solved. Generally he felt that most graphics and audio problems were 'solved' or nearly so, but that much progress remains to be made in AI, physics simulation, and networking.

He also discussed the need for very large development teams on commercial products, and stated his hope that indie developers will be able to make more creative and ground-breaking games with more agile teams. Carmack pointed out that the upcoming Doom 3 is the first product at id Software for which he didn't write the majority of the code, and he is clearly bothered by the loss of personal control and code ownership. He felt that content development now takes so long that it is nearly impossible to fully exploit current-generation hardware by the time a product reaches market.

Interestingly, during Q&A someone asked Carmack about the next generation of consoles, and he expressed a belief that the new multi-processor designs (like that of the PS3) had proven unworkable in the past, and he was therefore disappointed to see them returning to vogue: CPU/GPU parallelism is already hard enough to utilize well, and multiple-CPU designs would likely prove very difficult to use efficiently.


Lecture: "Revisiting Standard Joint Hierarchies"
This talk discussed ways of extending the standard joint hierarchy system used for skinned characters to better match true human joint structure. This is accomplished by augmenting the standard bone/joint indexing with a more complex jointmap. One-to-many mappings can then keep the animation-control degrees of freedom (DOF) low while providing many more individual articulation points. These mappings are managed with more complex controllers, dependency enforcement, and so on.

The discussion included many examples of using this approach to control a realistic human skeleton with anatomically correct behavior: side-to-side torso bending, proper arm rotation through a sinus cone, shoulder lifting with compensation, sliding bones such as the scapula, and floating rotation centers to simulate realistic non-point joints.


Lecture: "Growing a Dedicated Tools Programming Team"
BioWare makes extensive use of a tools programming team and has done so for several generations of product, and this talk discussed many of the issues in building and using such a team. Topics included evaluating when a dedicated tools team would be worthwhile, how to select engineers who work well on such a team, and the ways BioWare puts the team to use.

Generally speaking, BioWare uses the tools team for the creation and maintenance of game data editors, not to create shared engine technology. Such technology is handled by project-specific engineers and shared with the tools team for integration with the data tools. The BioWare tools team is focused on tool usability, data protection, and reducing development time for the projects. In general, the pre-production stage of a project uses the most tools-team resources, setting the stage for the production phase; once full production begins, the bulk of the tools programmers move to other projects, leaving some behind to optimize the tool chain and resolve bugs. Lead tools engineers help chart out the product's tool needs, and provide expertise on what tools already exist and what is practical to develop given the schedule.

The speaker, the BioWare tools team manager, made a number of key points. First, tools teams are expensive to maintain and often do not tie directly to the product, so they are effectively overhead; they can, however, reduce development time and improve product quality by providing much more robust tools for faster design iteration. Second, retention of tools staff is very important, and can be achieved both by growing the worth of the tools group to the organization and by providing a career path that remains within the group. Third, do not use the tools group as an entry point for junior programmers into the organization: this can waste a lot of the tools team's time and keep it from acquiring the resources it needs. Tools team members should be recruited for their desire to write tools.
