
September 22, 2010

Speaker: Dr. Martin de Lasa, Autodesk Canada

Title: Feature-Based Controllers for Physics-Based Character Animation

Abstract: Creating controllers for physics-based animation of characters is a long-standing open problem in animation and robotics. Such controllers would have numerous applications while also yielding insight into human motion. However, creating controllers remains very difficult: current approaches are either constrained to track motion capture data, are not robust, or provide limited control over style.

In this talk, I'll present an approach to control of physics-based characters based on high-level features of human movement, such as center-of-mass, angular momentum, and end-effector motion. Objective terms are used to control each feature and are combined via optimization. Using this approach, locomotion can be expressed in terms of a small number of features that control balance and end-effectors. The approach is used to build novel controllers for human balancing, standing jumps, walking, and jogging.
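To illustrate the idea of combining per-feature objective terms into a single optimization, here is a minimal toy sketch. It is not the method from the talk: the real controllers optimize nonlinear feature functions (center-of-mass, angular momentum, end-effector positions) of the character's state at every simulation step, whereas this sketch treats each "feature" as a single coordinate of the control vector and uses plain gradient descent. All function names, weights, and targets below are invented for illustration.

```python
# Hedged toy sketch: combining weighted per-feature objective terms into one
# optimization problem. Not the talk's actual controller; purely illustrative.

def combined_objective(u, targets, weights):
    """Weighted sum of squared feature errors for control vector u.

    Each 'feature' is modelled as one coordinate of u; a real controller
    would evaluate nonlinear features of the simulated character's state.
    """
    return sum(w * (ui - t) ** 2
               for ui, t, w in zip(u, targets, weights))

def minimize(targets, weights, steps=200, lr=0.04):
    """Plain gradient descent on the combined objective (toy solver)."""
    u = [0.0] * len(targets)
    for _ in range(steps):
        grad = [2 * w * (ui - t) for ui, t, w in zip(u, targets, weights)]
        u = [ui - lr * g for ui, g in zip(u, grad)]
    return u

# Hypothetical setup: a balance feature (e.g. center-of-mass height) is
# weighted more heavily than two end-effector goals, so the optimizer
# prioritizes it while still satisfying the others.
u = minimize(targets=[1.0, 0.3, -0.2], weights=[10.0, 1.0, 1.0])
```

The key design point the sketch mirrors is that each feature contributes its own objective term, and relative weights express priorities (balance over style goals); the optimizer then resolves them jointly rather than tracking any prerecorded motion.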

These controllers provide numerous benefits: human-like qualities such as arm-swing, heel-off, and hip-shoulder counter-rotation emerge automatically during walking; controllers are robust to changes in body parameters during movement; control parameters and goals may be modified at run-time; control parameters correspond to intuitive properties such as center-of-mass height; and controllers may be mapped onto entirely new bipeds with different topology and mass distribution, without modification to the controller itself. Transitions between multiple gaits, including walking, jumping, and jogging, emerge automatically. Controllers can traverse challenging terrain while following high-level user commands at interactive rates. The approach uses no motion capture data or offline optimization.

Biography: Martin de Lasa is a Senior Software Developer at Autodesk and part of the Media and Entertainment Division's Animation Solutions Team. Prior to joining Autodesk, he completed doctoral studies in Computer Science at the University of Toronto (2010). He also holds an MESc in Electrical Engineering from McGill University (2000), a BESc in Mechanical Engineering from the University of Western Ontario (UWO) (1998), and a BSc in Computer Science (1998) from UWO. Prior to his doctoral work he was a Technical Lead, Architect, and Manager at Boston Dynamics, where he played a leading role in human simulation and robotics projects, including Digital Biomechanics, BigDog, and LittleDog. His research interests span computer graphics and robotics, focusing on leveraging optimal control, optimization, and machine learning methods to build physically-based models of motion that ease the animation and control of simulated characters and robotic systems.