Motion Matching (#7232)

commit 0c873fc67f0250ec7155e1999c34930b23c7647f
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Jan 26 10:05:07 2022 +0100

    Motion Matching: Automated tests for the feature matrix and feature schema (#38)

    * Set up base motion matching test fixture
    * Automated tests for feature matrix
    * Automated tests for feature schema

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit cbd4f124481faf4c8bf0b98ae4602140f231e2fb
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Jan 25 09:59:32 2022 +0100

    Motion Matching: User adjustable algorithm via UI and new residual calculation option (#37)

    * Added the feature properties to the edit context. As all the preparation work has been done, this finalizes the adjustable algorithm UI work.
    * Added a new residual calculation type for calculating the differences between the input query values and the features extracted from the motion database. The differences are either used as absolute values or they are squared. Use 'Squared' in case minimal differences should be ignored and larger differences should outweigh others. Use 'Absolute' for linear differences without the mentioned effect.
    * Added comments and edit context data element descriptions about the feature properties.
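    The two residual options above can be sketched as follows. This is a hedged illustration, not the engine's actual API; `CalcResidual` and `ResidualType` are assumed names:

    ```cpp
    #include <cmath>

    // Illustrative sketch of the two residual options. Squared emphasizes
    // large differences and nearly ignores tiny ones; Absolute keeps the
    // cost linear in the difference.
    enum class ResidualType
    {
        Absolute,
        Squared
    };

    float CalcResidual(float queryValue, float featureValue, ResidualType type)
    {
        const float diff = queryValue - featureValue;
        return (type == ResidualType::Squared) ? diff * diff : std::fabs(diff);
    }
    ```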

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit ddad62b994be1e89cc394b28863faa44650c8ed1
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Jan 21 09:37:55 2022 +0100

    Motion Matching: Replaced the MotionMatchEventData with a DiscardFrameEventData and a TagEventData (#36)

    * Replaced the MotionMatchEventData with a DiscardFrameEventData and a TagEventData
    * Moved the system components and modules into the EMFX::MotionMatching namespace
    * Converted all the animation assetinfos to use the new discard frame motion event.
    * A few other fixes and code cleaning.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit f022ab4f3adbe5dc141998bc368a8ca9290698be
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Jan 20 07:29:01 2022 +0100

    Motion Matching: Refactoring [Part 4] Introduced MotionMatchingData (#35)

    * Renamed MotionMatchingConfig to MotionMatchingData and removed the feature schema from it. The schema needs to be part of the anim graph node in order to be reflected by the edit context, so that users can change the features used in the algorithm.
    * The default feature schema is applied in the anim graph node in case none was deserialized along with the node (e.g. when creating a new motion matching node).
    * Added a few class descriptions.
    * Fixed the lowest cost search frequency, which was used as its inverse (the time interval) while the UI and the variable were named frequency.
    * Removed the hard-coded weighting for the past and future trajectory and created new members for the cost factors in the trajectory feature.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit e05b96b0eb734cd4818e343d5b46dbb54712faf0
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Jan 19 10:36:20 2022 +0100

    Motion Matching: Architecture and Feature Schema Diagrams and README pass (#34)

    * Added architecture diagram
    * Added feature schema diagram
    * First pass on ReadMe.md

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit d50deb1f9eea79e9c56b3ded1bda62f755e3e97f
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Jan 19 10:35:44 2022 +0100

    Motion Matching: Refactoring [Part 3] Moved feature schema, matrix and the kd-tree to the config and removed the feature database (#33)

    * Removed feature database which contained the feature schema, feature matrix and the kd-tree.
    * Feature schema, feature matrix and the kd-tree are now part of the motion matching config.
    * DebugDraw moved from the config to the instance.
    * FindLowestCostFrameIndex moved from the config to the instance.
    * Kd-tree now has a helper to calculate the number of dimensions based on a feature set.
    * SaveToCsv moved from the feature database to the feature matrix directly.
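    The kd-tree dimension helper mentioned above can be sketched roughly like this; the `FeatureInfo` struct and function name are assumptions for illustration, not the engine's actual types:

    ```cpp
    #include <cstddef>
    #include <vector>

    // Illustrative stand-in for a feature's settings; the real feature classes
    // report their own dimensionality (e.g. 3 for a position, 2 for a 2D direction).
    struct FeatureInfo
    {
        size_t m_numDimensions = 0;
    };

    // The kd-tree's dimensionality is the sum over the features in the set.
    size_t CalcNumDimensions(const std::vector<FeatureInfo>& features)
    {
        size_t total = 0;
        for (const FeatureInfo& feature : features)
        {
            total += feature.m_numDimensions;
        }
        return total;
    }
    ```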

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 09aff543822a2a3e092edc65cfca315a83e25730
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Jan 14 17:08:18 2022 +0100

    Motion Matching: Refactoring [Part 2] - Added new default feature schema and removed hard-coded locomotion config (#32)

    * Added a new FeatureSchemaDefault class that creates the default feature schema (left/right foot position and velocity, pelvis velocity and root trajectory).
    * Removed the LocomotionConfig and moved the functionality like the FindLowestCostFrameIndex() to the motion matching config.
    * Moved more per-instance data to the motion matching instance where it actually belongs, like the temporary cost vectors or the cached trajectory feature pointer.
    * Added cost factor to the feature base class so that users can weight the costs of the features and adjust them in the UI later on.
    * Removed the hard-coded cost factors from the motion matching node.
    * Using the new, customizable config rather than the hard-coded locomotion config in the motion matching node.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 8eba891a7b599d81a1ad7cbed841e342034fee9f
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Jan 13 08:56:26 2022 +0100

    Motion Matching: Refactoring [Part 1] (#31)

    * Storing the joint name and the relative-to joint name as strings in the feature so that we can later expose them to the UI.
    * Joint indices are cached when initializing the features.
    * Some more code cleaning here and there.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 31e689015afaf61ededb50d4d3c5dddb61561603
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Jan 11 18:34:56 2022 +0100

    Motion Matching: Created feature schema class and separated functionality out from the feature database (#30)

    * Created a new feature schema class which holds the set of features involved in the motion matching search. The schema represents the order of the features as well as their settings, while the feature matrix stores the actual feature data.
    * Untangled the feature database from the feature schema functionality; it now just holds a feature schema.
    * Code cleaning (removed Allocators.cpp, fixed alignment, renamed some variables).

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit c407959bd5dd1c38a283900815624a8103173d37
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Jan 11 13:11:51 2022 +0100

    Motion Matching: Eigen SDK is now optional and a simple NxN matrix equivalent is provided (#29)

    * The Eigen SDK is now optional and users can opt in manually.
    * A simple NxN matrix class is provided for convenience that currently covers everything needed to run motion matching (the default).
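    A minimal dense matrix along those lines might look like the sketch below. Names and layout are assumptions; the actual class wraps whatever the gem needs:

    ```cpp
    #include <cstddef>
    #include <vector>

    // Minimal sketch of a dense, row-major float matrix that could stand in
    // for Eigen when the SDK is not enabled.
    class SimpleMatrix
    {
    public:
        SimpleMatrix(size_t rows, size_t cols)
            : m_rows(rows)
            , m_cols(cols)
            , m_data(rows * cols, 0.0f)
        {
        }

        float& operator()(size_t row, size_t col) { return m_data[row * m_cols + col]; }
        float operator()(size_t row, size_t col) const { return m_data[row * m_cols + col]; }

        size_t Rows() const { return m_rows; }
        size_t Cols() const { return m_cols; }

    private:
        size_t m_rows;
        size_t m_cols;
        std::vector<float> m_data; // contiguous storage, cache-friendly row access
    };
    ```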

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit dbd27a6ce68674e6fd3fcc9ca59c4d57884a133c
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Mon Jan 10 08:54:01 2022 +0100

    Motion Matching: Added pose data for joint velocities and unified the broad-phase query filling (#28)

    * Added a PoseDataJointVelocities class that extends the pose with velocities relative to a given joint.
    * Unified the broad-phase search using the KD-tree by getting rid of the hard-coded fill query feature value calls. We're now just iterating over the features.
    * Adding the new pose data to the factory from within the MM gem.
    * Joint velocities are calculated before each motion matching search for the query pose.
    * Moved the debug draw helper from the locomotion config to the base class.
    * Some code cleaning.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 7ec4151fede3046379bdbdab86459cc3684c730c
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Jan 4 09:20:38 2022 +0100

    Motion Matching: Yet another round of timing improvements (#26)

    * Sampling the input query pose for the motion matching search at the new motion time, as the motion instance has not been updated with the time delta yet when we do the search. Pre-sampling the future pose avoids small lags that occurred because the search was done on the last current pose, which is already a time delta old.
    * Fixed a bug: we never updated the previous motion instance while we were still blending between the poses, so we were essentially blending with a static pose after switching to a new best matching frame. Fixing this noticeably increased smoothness.
    * Re-enabled motion extraction delta blending now that the above bug is fixed, which gave another smoothness increase.
    * Renaming MM behavior into config (#27)
    * Behavior -> MotionMatchingConfig
    * LocomotionBehavior -> LocomotionConfig
    * BehaviorInstance -> MotionMatchingInstance
    * Removed the MotionMatchSystem and cleaned up reflection
    * Moved to namespace EMotionFX::MotionMatching

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit e9c6d6fd74641bed90e57efeb5b65ad00ad44562
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Dec 10 10:52:06 2021 +0100

    Improved smoothness and increased trajectory facing direction influence (#25)

    * When switching to a new best matching frame we were lagging a frame behind when updating the motion instance time.
    * Increased the influence of the facing direction as we have two different position difference costs while only one facing direction cost.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit f4a5d041a0d3c0bb747dd155a30a1ac993511db7
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Dec 8 08:54:16 2021 +0100

    Sampling poses for velocity feature directly from the motion data + some refactoring (#24)

    * New helper to calculate the velocity without a motion instance and sample the poses directly from the motion.
    * New helper to draw velocities other than the actual feature for a frame.
    * Now sharing the frame cost context between the position and the trajectory feature. The velocity feature can't use that yet as it needs custom velocity information. Will address that later.
    * After the changes, we were able to greatly simplify the extract features routine in the feature database as no motion instances are required anymore to extract the features from the source motions!
    * Some refactoring: renamed some variables and classes to make more sense.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 94e0ce6e6995cd4893eb5dada1119467dc26e360
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Dec 3 09:02:09 2021 +0100

    Sampling poses for trajectory feature directly from the motion data (#23)

    * Introduced a new SamplePose() helper to the Frame class which allows sampling a pose of the given frame without a motion instance. We can also apply a time offset to sample a pose before/after the frame, which is needed to sample trajectories or velocities as the Frame objects are currently sampled at 30 Hz. In case the offset reaches the animation boundaries, the sample time is clamped to the animation range.
    * The trajectory feature now samples the poses using the new helper directly from the motion data and does not need a motion instance object anymore which helps on the path towards multi-threading and also makes the code more readable and less error-prone.
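    The boundary handling described above amounts to a simple clamp; a sketch with illustrative names (the actual SamplePose() helper does the full pose sampling on top of this):

    ```cpp
    #include <algorithm>

    // A frame's sample time plus an offset is clamped to the motion's
    // [0, duration] range so that samples near the animation boundaries
    // stay valid.
    float ClampSampleTime(float frameTime, float timeOffset, float motionDuration)
    {
        return std::clamp(frameTime + timeOffset, 0.0f, motionDuration);
    }
    ```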

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 590ea3ddef575003f29e6d03b6db498a37da23a6
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Dec 2 09:20:48 2021 +0100

    Motion Matching: Motion extraction sometimes outputs zero movement and results in stuttering (#22)

    The problem was that after the motion matching algorithm determined a better matching frame and we started a cross-fade, at that given frame the motion extraction delta was outputting a zero vector, resulting in small stutters at the motion matching update frequency. This happened because the extraction calculation was done before the motion instance time values were updated, and the previous time values weren't set correctly when switching the frame/animation.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 6d0b64fd4ce2e4b3ad696c6cc5f3935447a7782e
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Dec 1 17:05:49 2021 +0100

    Motion Matching: Facing direction support (#21)

    * Added facing direction to the trajectory feature based on a forward facing axis of the character asset.
    * The facing direction is calculated in model/animation space relative to the root joint of the character of the current frame.
    * As the trajectory is looking into the past/future of the current frame the facing direction is relative to the root joint of another frame, either in the future or the past.
    * The facing direction cost is the normalized difference (normalized dot product) between the current character facing direction and the expected one from the used frame in the database.
    * The trajectory history as well as the trajectory query got extended by a facing direction.
    * Added visualization for facing direction.
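    A hedged sketch of a facing-direction cost built from the normalized dot product, as described above: 0 when the two directions agree, 1 when they are opposite. The engine's exact mapping may differ; the types and names here are illustrative:

    ```cpp
    #include <cmath>

    struct Vec2
    {
        float x;
        float y;
    };

    float FacingDirectionCost(const Vec2& current, const Vec2& expected)
    {
        const float lenA = std::sqrt(current.x * current.x + current.y * current.y);
        const float lenB = std::sqrt(expected.x * expected.x + expected.y * expected.y);
        if (lenA == 0.0f || lenB == 0.0f)
        {
            return 0.0f; // degenerate direction, treat as matching
        }
        const float dot = (current.x * expected.x + current.y * expected.y) / (lenA * lenB);
        return (1.0f - dot) * 0.5f; // dot = 1 -> cost 0, dot = -1 -> cost 1
    }
    ```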

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit a552881139debfba2b489b8e23d57cb2556635c1
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Nov 24 11:02:03 2021 +0100

    Motion Matching: Moved velocity feature from normalized direction and speed to a scaled vector (#20)

    * More stable representation for joints that remain motionless.
    * Skipping velocity debug visualizations for zero velocities.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 5a27e1db3de3d4ac753208a3c4af9bc2115f5b40
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Nov 18 13:11:15 2021 +0100

    Motion Matching: Final tweaks for Phase 1 milestone (#19)

    * Added pelvis velocity feature to smooth out sudden direction changes of it.
    * Blending the motion extraction delta as it results in smoother motion extraction after all of the fixes from the last weeks.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit c928dc5502cc9cbb4df3fccdae60039e68571c77
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Nov 16 12:45:13 2021 +0100

    Motion Matching: Move debug rendering to use DebugDisplayBus & some code cleaning (#18)

    * Debug drawing for LY Editor as well as Animation Editor in the new Atom viewport.
    * Ported from Atom aux geom to render backend independent debug display.
    * Renamed last occurrences of frame data to feature.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 57904c55d15694e60713f90f42715c18756f52c4
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Nov 11 16:38:22 2021 +0100

    Motion Matching: Move control spline into separate trajectory query class (#17)

    * Introducing new TrajectoryQuery class which represents the trajectory history and desired future trajectory control points input for the motion matching search.
    * The trajectory history will be used to sample the number of expected control points for the past trajectory based on the trajectory feature.
    * For the future trajectory, I just ported over the different functions that we had previously. Will revisit that with a later change.
    * Moved DebugDrawControlSpline() from locomotion behavior to the TrajectoryQuery.
    * Trajectory query is owned by the behavior instance, same as the trajectory history.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit fbed5b1803f820a86ffa562a2be9bbae00c64109
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Nov 9 10:36:03 2021 +0100

    Motion Matching: Trajectory feature improvements (#16)

    * Reduced dimensionality of the trajectory feature. Removed the velocity from the sample as the velocity is embedded in the distances between the control points already.
    * Reduced the position and facing direction features from 3D to 2D as motion extraction is projected to the ground plane anyway and the last component was always zeroed out.
    * Improved the cost function by removing the angle difference from it. After the latest changes in the trajectory history and the bug fixes in the trajectory feature and history, results are better without it.
    * Adjusted some target path generation parameters to get more variation in the locomotion results.
    * Some minor code cleanup.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 796b763d676380e0b9fbe0ae72824a7f29e59eab
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Nov 4 11:58:39 2021 +0100

    Major rewrite of the trajectory history (#15)

    * Fixed several indexing-related sampling issues by elevating the keytrack class, which has been proven to work, rather than reinventing the wheel here.
    * Offers two ways to sample the trajectory history, either based on time in seconds or using a normalized value.
    * Automatically ensures that the trajectory feature does not require a longer history than the history itself records.
    * History older than the trajectory feature requires is rendered as semi-transparent spheres that fade out over time.
    * Trajectory history is now owned by the behavior instance rather than the anim graph node to have everything in a central place.
    * Trajectory history is pre-filled with the character location at init time so that we simulate a standing character without confusing the motion matching algorithm or needing special-case handling.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 80c6572380e1b879014aae5aecdaaea1aa3160c1
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Nov 3 09:34:16 2021 +0100

    Motion Matching: Improve trajectory feature extraction and cost calculation (#14)

    * Fixed bug in calculating the past frame index which led to retrieving wrong data and skewed costs.
    * Fixed an off-by-one indexing issue. We were only seeing 5 visual control points in our debug viz while there should have been 6. That is now fixed, and the cost is more accurate as we're actually comparing the right control points now.
    * Changed the cost function from just calculating the spatial differences between the desired and actual trajectory positions to a combination of relative spatial differences and their angles. This reduced the average costs and resulted in better matching frames and smoother synthesized animations.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 7cc180054ede4fab8ba592c45c2feaf746beeb28
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Oct 29 08:54:41 2021 +0200

    Remove bad animation ranges with discard range events (#12)

    We now have a mechanism in place to discard specific frames/ranges of an animation and exclude them from the motion matching database. We have several sections in our animations where the arms are over the head, where the person suddenly crouches for a bit, or other poses that do not match the other locomotion data. I went through the animation database and discarded all of these sections in order to improve the results of the synthesized animations.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 14f36cdc1813aaa9a5ce514272cef01bad3f38b6
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Oct 29 08:54:13 2021 +0200

    Remove direction feature (facing direction will be part of the trajectory) (#13)

    Removed the currently unused and half-implemented direction feature. The facing direction will be part of the trajectory in the future.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit a8e6f1c42f92cb3a207bf04371e884d8f5d0fbb4
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Oct 26 13:30:07 2021 +0200

    Motion Matching: Improved spatial velocity feature calculation (#11)

    * Added Motion Matching ImGuiMonitor and Bus

    Added bus for pushing values to the histogram and a monitor owning and rendering the histograms.

    * Implemented performance and feature cost histograms

    * Improved spatial velocity feature calculation

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 925e05bdd3c433af767118eff7b3c93bb6e2a94a
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Mon Oct 25 10:26:37 2021 +0200

    Add position feature visualization and other debug viz improvements (#10)

    * Rendering a sphere for the position feature in order to see the offset between the actual foot position and the best matching frame's extracted position feature.
    * Improved the velocity feature visualization by rendering an arrow head and a thicker direction line.
    * Added the ability to have different bar colors for different histograms.
    * Matched histogram bar with feature visualization colors to easily understand which 3D viz belongs to which histogram.
    * Using better matching color palette.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 6600abcff16df4e49a744bbeb68d00d9fe03f31d
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Oct 20 17:18:05 2021 +0200

    Use feature matrix to generalize FillFrameFloats() and improve feature visualizations (#9)

    * Generalized the FillFrameFloats() in the KD-tree by utilizing the feature matrix and dimensionality information from the features to replace the custom per-feature functions with a shared version.
    * Improved the trajectory visualization by consolidating the past and future trajectory rendering into a single function, with nicer visuals by replacing the line-based markers with spheres and cylinders to fake thicker lines until the aux geom is able to render them.
    * Now using Atom's aux geom for improved rendering of the features.
    * Included some more feedback from the last PR.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 96a90bc62cf4220d1afd207fce9c6cab9e453c3b
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Oct 20 09:04:07 2021 +0200

    Motion Matching: Store features used for kd-tree in feature database and get rid of the local flags inside the feature class (#8)

    * The list of features used in the KD-tree is now separated and not part of the actual feature descriptor anymore
    * Renamed frame floats to query features / feature values to make it align better with the rewrite from some weeks ago.
    * Some more code cleanup

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit e430637f734b20e6813a5c44572645390fe77c92
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Oct 15 09:25:49 2021 +0200

    Feature cost and performance metrics visualization (#7)

    * Added Motion Matching ImGuiMonitor and Bus

    Added bus for pushing values to the histogram and a monitor owning and rendering the histograms.

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

    * Implemented performance and feature cost histograms

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit f8ca765fcc168942c361dfea249b08915dac5f77
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Mon Oct 11 09:21:17 2021 +0200

    Motion matching: Data analysis and Visualization (Part 2) (#6)

    * Added scatterplot using PCA
    * Added feature correlation heatmap
    * Data normalization ground truth using sklearn's scaler (min-max scaling)
    * Histogram for verifying that the value distributions stayed the same after normalizing
    * PCA scatterplot for the normalized data

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 21d3f8b9e4b2545bfe07bf2a08ed1e39337dfc58
Merge: 58abd6c a6587bb
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Oct 7 09:16:39 2021 +0200

    Motion Matching: Feature matrix CSV export

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit a6587bb9218e2c618ac3a836b6374ab16fcc9bbc
Merge: 3f5ac80 aee9039
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Thu Oct 7 09:13:32 2021 +0200

    Motion Matching: Start of a Jupyter notebook for feature data analysis and visualizations

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit aee903942ed0b05d9397778d606775482dc1cfc7
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Oct 6 15:35:14 2021 +0200

    Motion Matching: Feature matrix CSV export

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 3f5ac808323a7b6a78cdf5c1f8ceaf8d175e9857
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Oct 5 10:04:44 2021 +0200

    Motion Matching: Feature matrix CSV export

    * Added GetDimensionName() function for the feature to output a component's name, which corresponds to a column in the feature matrix.
    * Added a SaveAsCsv() function to the feature matrix which exports an Eigen matrix to a .csv file plus column names based on the feature component names.
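    A simplified version of such an export, writing to a string instead of a file and using plain vectors instead of the Eigen-backed matrix (names here are illustrative, not the gem's API):

    ```cpp
    #include <sstream>
    #include <string>
    #include <vector>

    // Builds CSV text: the column names first, then one row per frame.
    std::string ToCsv(const std::vector<std::string>& columnNames,
                      const std::vector<std::vector<float>>& rows)
    {
        std::ostringstream out;
        for (size_t i = 0; i < columnNames.size(); ++i)
        {
            out << (i ? "," : "") << columnNames[i];
        }
        out << "\n";
        for (const std::vector<float>& row : rows)
        {
            for (size_t i = 0; i < row.size(); ++i)
            {
                out << (i ? "," : "") << row[i];
            }
            out << "\n";
        }
        return out.str();
    }
    ```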

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 58abd6c9600f67121eaec9b80c2264bba73ff198
Merge: 1fb12e4 fd12fda
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Mon Oct 4 08:44:25 2021 +0200

    Motion Matching: Created feature matrix and moved all feature data to it

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit fd12fda7843ec6d361caa3a26892fcdb0dfc9f7c
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Oct 1 09:01:24 2021 +0200

    Created feature matrix and moved all feature data into it

    We now have a new FeatureMatrix which is responsible for storing the feature data that we need for the motion matching algorithm in a cache-efficient way.
    The great thing is that this is also the enabler for any feature analysis and later on machine learning as we're in the right data format already.

    The FeatureMatrix internally stores the data in a 2D dense matrix from the Eigen library. This can easily be replaced with another linear algebra/vector/matrix library though and the FeatureMatrix acts as a wrapper.

    We're currently extracting the following motion features from our motion database:
    Left foot position/velocity, right foot position/velocity and the root joint trajectory with 6 past and 6 future sample positions and directions.

    Having 41203 keyframes/poses in our motion database, this results in a feature matrix holding 22.63 MB of data. This is only the data size for the extracted features.
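    As a sanity check on figures like the 22.63 MB above, a dense float matrix's footprint is simply rows times columns times 4 bytes. A sketch of the estimate (the dimension count per schema is whatever feature extraction produces, not specified here):

    ```cpp
    #include <cstddef>

    // Rough footprint of a dense float feature matrix: one row per pose,
    // one column per feature dimension, sizeof(float) bytes per value.
    size_t FeatureMatrixBytes(size_t numPoses, size_t numFeatureDimensions)
    {
        return numPoses * numFeatureDimensions * sizeof(float);
    }
    ```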

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 7cdd4d9b0b525b3a7c218ce4a16dfa9d2a90132c
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Sep 22 14:21:14 2021 +0200

    Adding profile instrumentations

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 1fb12e4af1fd1d70c8f39842a0207de96bd1550c
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Sep 17 13:10:55 2021 +0200

    Fixing compile issues due to new warning mode

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 9ea3a67aa985d872498fcc19bb973311089f9679
Merge: 5e5d69c 99cfb29
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Sep 15 17:49:09 2021 +0200

    Added velocity feature visualization and improved velocity calculation

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 99cfb2914cdadcf5616570c30a15b246abb20f97
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Wed Sep 8 16:35:12 2021 +0200

    Addressed PR feedback

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit af291c0ca97e17d1bb07075a5546747591812e9e
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Tue Sep 7 10:36:18 2021 +0200

    Added velocity feature visualization and improved velocity calculation

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 5e5d69c9293c3609950e55d19ea6145a21027187
Merge: f2c1577 902179c
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Mon Sep 6 18:58:59 2021 +0200

    MotionMatching: Separated features from source data and renamed FrameData to Feature #1

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 902179c6a19f91bca64c37a383adc776da15f8e8
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Fri Sep 3 15:25:27 2021 +0200

    Separated features from source data and renamed FrameData to Feature

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit f2c1577febb46d8957693826d404f8cc6ad5e240
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Mon Aug 30 12:10:30 2021 +0200

    Fixed some compile errors for the latest version

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

commit 498018553ccf75b0e3983895984401680ffa25b9
Author: Benjamin Jillich <jillich@amazon.com>
Date:   Mon Aug 30 11:51:21 2021 +0200

    Initial motion matching prototype code

    Signed-off-by: Benjamin Jillich <jillich@amazon.com>

parent d7b3d33711
commit 2068c225d1

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:33d5b3035966ac54bbb97a207ca14868a6daef9ad9e596281f7930764b54c819
size 2527664

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:91d6a8c16bae554339d1849285804b798e136113c6aecc3dfed1be635c484600
size 4750720

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e18101baa7b33af4ac26181b7c3c20d5c12e9d9d2f57bd24271c6e4665e1a65a
size 3001296

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8eda027438797e48e6b00745e17e3ec4da6d507735c754056a8fea19c465a528
size 5926096

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0c2af3ea28464b4d7269f54c1ff1ded691b19b826d5de85139d20ae825aae992
size 5085168

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:25593a1e4670dbb9cb1c5a0fc375ba2bf7862784c45847d76eefd44d39df86f9
size 2974896

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e21e3f9b7db6572f740a99159bba72eca3697d3d3e1c07562579be9599bcebb7
size 2511488

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3802b0e44b3e0aa6610a4157510c3038bfe4038211202cebd6eb116ffd53cee9
size 2587408

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0f07e278fcee81d719ec2e7c02417e2ee710b9d224a29798cfb97b55a9ffe351
size 3515488

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cc4d0b653f910cea3c7c3b8d08963af0cc331ad91e467c7becb2c7f9b9c5b402
size 4068144

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:58cd43568cdd3f9b9585b65208cfff6d6c5f5bea9ad67878a1efbeebb8ef8a6d
size 1812176

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:18a831cfe702120d163e44a27d5cd33faf64f5f892da3789aea1bbfecd528e72
size 3288400

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3649e2d0b239f9da45af2b04e937ce6113e8dee74a70a3cef748995ea15f6120
size 2354544

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:207378e42ebdad1c3dfe8cefc84ed50e1d322895f018e2efca14fdcbf2e6600f
size 1890352

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:78c68566aba845544ec96a92f2cf4e36350c74c16e931c3aa9e14151bcaed8e8
size 3881680

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:73560b662d3819647985c3f7b0544ba3ad71365e8393a3915630b254f86c6286
size 3669552

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e2fad264af33252d49c6e3a0b6c6ef83697b951a54c7ab0cd4259f145024d915
size 3962336

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:149ae8bfc0dc4ad8471e79f5cd8ce5420c02bc5c8a5cf1971ebbf997cae21096
size 3834704

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b6468a660cfd88c350443e92b31e41362909d705c344cda6640c65a5e1fd2a29
size 2288048

@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b824d6fd5bb3e7f6b4f29ff90b406179539e1c155ba9f8870ef54f1d1d0e22e7
size 3128032

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:38f0f9e280ff686032aa223cc8ce754442edc616d28a16d39a87c90bd221c91a
size 2807280

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:862ddf8893923169938db017f573f63228bb5aec3d4892d21a525200dd6c4680
size 4063280

@@ -0,0 +1,50 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "acceleration1",
"selectedRootBone": "RootNode.root",
"id": "{ADB2CDC1-8EA3-5B21-90D6-43EBE9991709}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 19.600000381469727,
"endTime": 20.666667938232422
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 50.599998474121094,
"endTime": 53.19999694824219
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,41 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "circles1",
"selectedRootBone": "RootNode.root",
"id": "{BF21E0D5-87F6-5A3F-B100-507F217D4C7E}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 112.73334503173828,
"endTime": 114.00001525878906
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,125 @@
<ObjectStream version="3">
<Class name="SceneManifest" version="1" type="{9274AD17-3212-4651-9F3B-7DCCB080E467}">
<Class name="AZStd::vector" field="values" type="{5D6A7C67-11CA-59A4-829B-0B20B781B292}">
<Class name="AZStd::shared_ptr" field="element" type="{EB7522F9-0E87-55A9-A191-E924DC5AE867}">
<Class name="MotionGroup" field="element" version="3" type="{1B0ABB1E-F6DF-4534-9A35-2DD8244BF58C}">
<Class name="IMotionGroup" field="BaseClass1" version="1" type="{1CA400A8-2C3E-423D-B8A3-C457EF88E533}">
<Class name="IGroup" field="BaseClass1" version="1" type="{DE008E67-790D-4672-A73A-5CA0F31EDD2D}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
</Class>
</Class>
<Class name="AZStd::string" field="name" value="crouching1" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::string" field="selectedRootBone" value="RootNode.root" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZ::Uuid" field="id" value="{A552C76F-FAA4-5657-8955-CC64913C9C9C}" type="{E152C105-A133-4D03-BBF8-3D4B2FBA3E2A}"/>
<Class name="RuleContainer" field="rules" version="1" type="{2C20D3DF-57FF-4A31-8680-A4D45302B9CF}">
<Class name="AZStd::vector" field="rules" type="{B5BDB053-178F-5D55-8663-70897A71B7C9}">
<Class name="AZStd::shared_ptr" field="element" type="{0BB4AFBA-F087-55C7-95DF-01D71F6CB052}">
<Class name="CoordinateSystemRule" field="element" version="1" type="{603207E2-4F55-4C33-9AAB-98CA75C1E351}">
<Class name="IRule" field="BaseClass1" version="1" type="{81267F8B-3963-423B-9FF7-D276D82CD110}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
</Class>
<Class name="int" field="targetCoordinateSystem" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
</Class>
<Class name="AZStd::shared_ptr" field="element" type="{0BB4AFBA-F087-55C7-95DF-01D71F6CB052}">
<Class name="MetaDataRule" field="element" version="2" type="{8D759063-7D2E-4543-8EB3-AB510A5886CF}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
<Class name="AZStd::vector" field="commands" type="{C9984A24-DA9E-518F-9F81-27E51FAEB1F7}">
<Class name="CommandSystem::CommandAdjustMotion" field="element" version="1" type="{A8977553-4011-4BEB-97C8-6AE44B07C7A8}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::optional" field="dirtyFlag" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
<Class name="AZStd::optional" field="motionExtractionFlags" type="{43BA4537-CBCC-54F5-B403-84188A203D60}">
<Class name="unsigned char" field="element" value="0" type="{72B9409A-7D1A-4831-9CFE-FCB3FADD3426}"/>
</Class>
<Class name="AZStd::optional" field="name" type="{B0D91084-263A-54B9-A4F3-7C5F4240E248}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEventTrack" field="element" version="1" type="{961F762D-5B90-4E21-8692-9FADDCA54E6C}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="Sync" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::optional" field="eventTrackIndex" type="{5BC30B08-5E9C-5B73-BDE4-4BE8170C21C6}"/>
<Class name="AZStd::optional" field="isEnabled" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEventTrack" field="element" version="1" type="{961F762D-5B90-4E21-8692-9FADDCA54E6C}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::optional" field="eventTrackIndex" type="{5BC30B08-5E9C-5B73-BDE4-4BE8170C21C6}"/>
<Class name="AZStd::optional" field="isEnabled" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEvent" field="element" type="{D19C2AFB-A5AA-4367-BCFC-02EB88C1B61F}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="float" field="startTime" value="0.0000000" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="float" field="endTime" value="3.9581690" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="AZStd::optional" field="eventDatas" type="{C69F2148-2DBF-5C3D-8154-447C6E99D041}">
<Class name="AZStd::vector" field="element" type="{4918F644-6F31-5DC0-92CC-471D16563C8A}">
<Class name="AZStd::shared_ptr" field="element" type="{FA64741F-5600-53BC-9B56-A81DD4938FF5}">
<Class name="MotionMatchEventData" field="element" version="1" type="{25499823-E611-4958-85B7-476BC1918744}">
<Class name="EventData" field="BaseClass1" version="1" type="{F6AFCD3B-D58E-4821-9E7C-D1F437304E5D}"/>
<Class name="AZStd::string" field="tag" value="discard" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
<Class name="CommandSystem::CommandCreateMotionEvent" field="element" type="{D19C2AFB-A5AA-4367-BCFC-02EB88C1B61F}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="float" field="startTime" value="27.8391228" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="float" field="endTime" value="31.8632603" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="AZStd::optional" field="eventDatas" type="{C69F2148-2DBF-5C3D-8154-447C6E99D041}">
<Class name="AZStd::vector" field="element" type="{4918F644-6F31-5DC0-92CC-471D16563C8A}">
<Class name="AZStd::shared_ptr" field="element" type="{FA64741F-5600-53BC-9B56-A81DD4938FF5}">
<Class name="MotionMatchEventData" field="element" version="1" type="{25499823-E611-4958-85B7-476BC1918744}">
<Class name="EventData" field="BaseClass1" version="1" type="{F6AFCD3B-D58E-4821-9E7C-D1F437304E5D}"/>
<Class name="AZStd::string" field="tag" value="discard" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
<Class name="CommandSystem::CommandCreateMotionEvent" field="element" type="{D19C2AFB-A5AA-4367-BCFC-02EB88C1B61F}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="float" field="startTime" value="50.0708389" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="float" field="endTime" value="67.1569366" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="AZStd::optional" field="eventDatas" type="{C69F2148-2DBF-5C3D-8154-447C6E99D041}">
<Class name="AZStd::vector" field="element" type="{4918F644-6F31-5DC0-92CC-471D16563C8A}">
<Class name="AZStd::shared_ptr" field="element" type="{FA64741F-5600-53BC-9B56-A81DD4938FF5}">
<Class name="MotionMatchEventData" field="element" version="1" type="{25499823-E611-4958-85B7-476BC1918744}">
<Class name="EventData" field="BaseClass1" version="1" type="{F6AFCD3B-D58E-4821-9E7C-D1F437304E5D}"/>
<Class name="AZStd::string" field="tag" value="discard" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
<Class name="AZStd::string" field="metaData" value="" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</ObjectStream>

@@ -0,0 +1,59 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "freeroaming1",
"selectedRootBone": "RootNode.root",
"id": "{A07E54E7-BB49-5DB3-BCA1-5EC8B4FA74A3}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 73.33333587646484,
"endTime": 111.13333129882813
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 134.3333282470703,
"endTime": 136.13333129882813
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 145.1999969482422,
"endTime": 146.13333129882813
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,41 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "freeroaming2",
"selectedRootBone": "RootNode.root",
"id": "{96DC1ABD-1F72-5546-8B7F-7092C3AC0E5D}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 119.26667022705078,
"endTime": 123.4000015258789
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,41 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "jog1",
"selectedRootBone": "RootNode.root",
"id": "{9200D325-808C-5B2D-B323-1FF9790C07B7}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 64.0,
"endTime": 65.53333282470703
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,53 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "jogpivotturn1",
"selectedRootBone": "RootNode.root",
"id": "{596210E7-A7F4-511D-886B-AA4FED4AC92B}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false
},
{
"name": "Event Track 2",
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 36.66666793823242,
"endTime": 40.733333587646484
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 41.733333587646484,
"endTime": 53.06666564941406
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,53 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "mixedlocomotion1",
"selectedRootBone": "RootNode.root",
"id": "{2F7682E4-235E-5A31-B450-266D7DC00E39}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false
},
{
"name": "Event Track 2",
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 27.53333282470703,
"endTime": 29.999998092651367
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 56.53333282470703,
"endTime": 60.666664123535156
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,41 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "outofrange1",
"selectedRootBone": "RootNode.root",
"id": "{6B28C886-471C-5506-AD03-DF19025F6DA1}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 29.299814224243164,
"endTime": 33.83555221557617
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,41 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "run1",
"selectedRootBone": "RootNode.root",
"id": "{12953346-AF3A-5481-A54F-9119523C4538}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 23.600000381469727,
"endTime": 27.933334350585938
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,50 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "runpivotturn1",
"selectedRootBone": "RootNode.root",
"id": "{DEF0D469-00AB-57D0-AF05-6AF1D6563D4A}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 25.080034255981445,
"endTime": 28.964109420776367
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 29.574464797973633,
"endTime": 36.288368225097656
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,50 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "snake1",
"selectedRootBone": "RootNode.root",
"id": "{4A29F10E-0083-559F-A78B-282A9EF87E00}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 34.46666717529297,
"endTime": 38.733333587646484
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 89.4000015258789,
"endTime": 90.86666870117188
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,106 @@
<ObjectStream version="3">
<Class name="SceneManifest" version="1" type="{9274AD17-3212-4651-9F3B-7DCCB080E467}">
<Class name="AZStd::vector" field="values" type="{5D6A7C67-11CA-59A4-829B-0B20B781B292}">
<Class name="AZStd::shared_ptr" field="element" type="{EB7522F9-0E87-55A9-A191-E924DC5AE867}">
<Class name="MotionGroup" field="element" version="3" type="{1B0ABB1E-F6DF-4534-9A35-2DD8244BF58C}">
<Class name="IMotionGroup" field="BaseClass1" version="1" type="{1CA400A8-2C3E-423D-B8A3-C457EF88E533}">
<Class name="IGroup" field="BaseClass1" version="1" type="{DE008E67-790D-4672-A73A-5CA0F31EDD2D}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
</Class>
</Class>
<Class name="AZStd::string" field="name" value="turnonspot1" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::string" field="selectedRootBone" value="RootNode.root" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZ::Uuid" field="id" value="{809EA81B-7477-5249-BD68-5EC01DA46F50}" type="{E152C105-A133-4D03-BBF8-3D4B2FBA3E2A}"/>
<Class name="RuleContainer" field="rules" version="1" type="{2C20D3DF-57FF-4A31-8680-A4D45302B9CF}">
<Class name="AZStd::vector" field="rules" type="{B5BDB053-178F-5D55-8663-70897A71B7C9}">
<Class name="AZStd::shared_ptr" field="element" type="{0BB4AFBA-F087-55C7-95DF-01D71F6CB052}">
<Class name="CoordinateSystemRule" field="element" version="1" type="{603207E2-4F55-4C33-9AAB-98CA75C1E351}">
<Class name="IRule" field="BaseClass1" version="1" type="{81267F8B-3963-423B-9FF7-D276D82CD110}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
</Class>
<Class name="int" field="targetCoordinateSystem" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
</Class>
<Class name="AZStd::shared_ptr" field="element" type="{0BB4AFBA-F087-55C7-95DF-01D71F6CB052}">
<Class name="MetaDataRule" field="element" version="2" type="{8D759063-7D2E-4543-8EB3-AB510A5886CF}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
<Class name="AZStd::vector" field="commands" type="{C9984A24-DA9E-518F-9F81-27E51FAEB1F7}">
<Class name="CommandSystem::CommandAdjustMotion" field="element" version="1" type="{A8977553-4011-4BEB-97C8-6AE44B07C7A8}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::optional" field="dirtyFlag" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
<Class name="AZStd::optional" field="motionExtractionFlags" type="{43BA4537-CBCC-54F5-B403-84188A203D60}">
<Class name="unsigned char" field="element" value="0" type="{72B9409A-7D1A-4831-9CFE-FCB3FADD3426}"/>
</Class>
<Class name="AZStd::optional" field="name" type="{B0D91084-263A-54B9-A4F3-7C5F4240E248}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEventTrack" field="element" version="1" type="{961F762D-5B90-4E21-8692-9FADDCA54E6C}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="Sync" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::optional" field="eventTrackIndex" type="{5BC30B08-5E9C-5B73-BDE4-4BE8170C21C6}"/>
<Class name="AZStd::optional" field="isEnabled" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEventTrack" field="element" version="1" type="{961F762D-5B90-4E21-8692-9FADDCA54E6C}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::optional" field="eventTrackIndex" type="{5BC30B08-5E9C-5B73-BDE4-4BE8170C21C6}"/>
<Class name="AZStd::optional" field="isEnabled" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEvent" field="element" type="{D19C2AFB-A5AA-4367-BCFC-02EB88C1B61F}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="float" field="startTime" value="0.0000000" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="float" field="endTime" value="1.7333330" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="AZStd::optional" field="eventDatas" type="{C69F2148-2DBF-5C3D-8154-447C6E99D041}">
<Class name="AZStd::vector" field="element" type="{4918F644-6F31-5DC0-92CC-471D16563C8A}">
<Class name="AZStd::shared_ptr" field="element" type="{FA64741F-5600-53BC-9B56-A81DD4938FF5}">
<Class name="MotionMatchEventData" field="element" version="1" type="{25499823-E611-4958-85B7-476BC1918744}">
<Class name="EventData" field="BaseClass1" version="1" type="{F6AFCD3B-D58E-4821-9E7C-D1F437304E5D}"/>
<Class name="AZStd::string" field="tag" value="discard" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
<Class name="CommandSystem::CommandCreateMotionEvent" field="element" type="{D19C2AFB-A5AA-4367-BCFC-02EB88C1B61F}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="float" field="startTime" value="63.0666656" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="float" field="endTime" value="86.0666656" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="AZStd::optional" field="eventDatas" type="{C69F2148-2DBF-5C3D-8154-447C6E99D041}">
<Class name="AZStd::vector" field="element" type="{4918F644-6F31-5DC0-92CC-471D16563C8A}">
<Class name="AZStd::shared_ptr" field="element" type="{FA64741F-5600-53BC-9B56-A81DD4938FF5}">
<Class name="MotionMatchEventData" field="element" version="1" type="{25499823-E611-4958-85B7-476BC1918744}">
<Class name="EventData" field="BaseClass1" version="1" type="{F6AFCD3B-D58E-4821-9E7C-D1F437304E5D}"/>
<Class name="AZStd::string" field="tag" value="discard" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
<Class name="AZStd::string" field="metaData" value="" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</ObjectStream>

@@ -0,0 +1,59 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "walk1",
"selectedRootBone": "RootNode.root",
"id": "{B161FB42-0EC0-51DA-BB0E-F04B73C0DE0C}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 11.533333778381348,
"endTime": 12.533333778381348
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 35.20000076293945,
"endTime": 37.53333282470703
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 76.33333587646484,
"endTime": 93.66667175292969
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,50 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "walk2",
"selectedRootBone": "RootNode.root",
"id": "{D4D1809B-5085-59E4-B98C-D29AE1A90277}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 37.599998474121094,
"endTime": 40.266666412353516
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 81.26667022705078,
"endTime": 84.0666732788086
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,106 @@
<ObjectStream version="3">
<Class name="SceneManifest" version="1" type="{9274AD17-3212-4651-9F3B-7DCCB080E467}">
<Class name="AZStd::vector" field="values" type="{5D6A7C67-11CA-59A4-829B-0B20B781B292}">
<Class name="AZStd::shared_ptr" field="element" type="{EB7522F9-0E87-55A9-A191-E924DC5AE867}">
<Class name="MotionGroup" field="element" version="3" type="{1B0ABB1E-F6DF-4534-9A35-2DD8244BF58C}">
<Class name="IMotionGroup" field="BaseClass1" version="1" type="{1CA400A8-2C3E-423D-B8A3-C457EF88E533}">
<Class name="IGroup" field="BaseClass1" version="1" type="{DE008E67-790D-4672-A73A-5CA0F31EDD2D}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
</Class>
</Class>
<Class name="AZStd::string" field="name" value="walkpivotturn1" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::string" field="selectedRootBone" value="RootNode.root" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZ::Uuid" field="id" value="{575C0159-F151-596D-BCFB-2F92CF7DB9A1}" type="{E152C105-A133-4D03-BBF8-3D4B2FBA3E2A}"/>
<Class name="RuleContainer" field="rules" version="1" type="{2C20D3DF-57FF-4A31-8680-A4D45302B9CF}">
<Class name="AZStd::vector" field="rules" type="{B5BDB053-178F-5D55-8663-70897A71B7C9}">
<Class name="AZStd::shared_ptr" field="element" type="{0BB4AFBA-F087-55C7-95DF-01D71F6CB052}">
<Class name="CoordinateSystemRule" field="element" version="1" type="{603207E2-4F55-4C33-9AAB-98CA75C1E351}">
<Class name="IRule" field="BaseClass1" version="1" type="{81267F8B-3963-423B-9FF7-D276D82CD110}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
</Class>
<Class name="int" field="targetCoordinateSystem" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
</Class>
<Class name="AZStd::shared_ptr" field="element" type="{0BB4AFBA-F087-55C7-95DF-01D71F6CB052}">
<Class name="MetaDataRule" field="element" version="2" type="{8D759063-7D2E-4543-8EB3-AB510A5886CF}">
<Class name="IManifestObject" field="BaseClass1" type="{3B839407-1884-4FF4-ABEA-CA9D347E83F7}"/>
<Class name="AZStd::vector" field="commands" type="{C9984A24-DA9E-518F-9F81-27E51FAEB1F7}">
<Class name="CommandSystem::CommandAdjustMotion" field="element" version="1" type="{A8977553-4011-4BEB-97C8-6AE44B07C7A8}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::optional" field="dirtyFlag" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
<Class name="AZStd::optional" field="motionExtractionFlags" type="{43BA4537-CBCC-54F5-B403-84188A203D60}">
<Class name="unsigned char" field="element" value="0" type="{72B9409A-7D1A-4831-9CFE-FCB3FADD3426}"/>
</Class>
<Class name="AZStd::optional" field="name" type="{B0D91084-263A-54B9-A4F3-7C5F4240E248}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEventTrack" field="element" version="1" type="{961F762D-5B90-4E21-8692-9FADDCA54E6C}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="Sync" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::optional" field="eventTrackIndex" type="{5BC30B08-5E9C-5B73-BDE4-4BE8170C21C6}"/>
<Class name="AZStd::optional" field="isEnabled" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEventTrack" field="element" version="1" type="{961F762D-5B90-4E21-8692-9FADDCA54E6C}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="AZStd::optional" field="eventTrackIndex" type="{5BC30B08-5E9C-5B73-BDE4-4BE8170C21C6}"/>
<Class name="AZStd::optional" field="isEnabled" type="{0170062C-2E7E-5CEB-BAB8-F7663BEF7B3E}"/>
</Class>
<Class name="CommandSystem::CommandCreateMotionEvent" field="element" type="{D19C2AFB-A5AA-4367-BCFC-02EB88C1B61F}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="float" field="startTime" value="0.0000000" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="float" field="endTime" value="0.5569170" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="AZStd::optional" field="eventDatas" type="{C69F2148-2DBF-5C3D-8154-447C6E99D041}">
<Class name="AZStd::vector" field="element" type="{4918F644-6F31-5DC0-92CC-471D16563C8A}">
<Class name="AZStd::shared_ptr" field="element" type="{FA64741F-5600-53BC-9B56-A81DD4938FF5}">
<Class name="MotionMatchEventData" field="element" version="1" type="{25499823-E611-4958-85B7-476BC1918744}">
<Class name="EventData" field="BaseClass1" version="1" type="{F6AFCD3B-D58E-4821-9E7C-D1F437304E5D}"/>
<Class name="AZStd::string" field="tag" value="discard" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
<Class name="CommandSystem::CommandCreateMotionEvent" field="element" type="{D19C2AFB-A5AA-4367-BCFC-02EB88C1B61F}">
<Class name="MCore::Command" field="BaseClass1" version="1" type="{49C636CE-7C0E-408A-A0F7-F7D12647EFBA}"/>
<Class name="CommandSystem::MotionIdCommandMixin" field="BaseClass2" version="1" type="{968E9513-3159-4469-B5FA-97D0920456E3}">
<Class name="int" field="motionID" value="0" type="{72039442-EB38-4D42-A1AD-CB68F7E0EEF6}"/>
</Class>
<Class name="AZStd::string" field="eventTrackName" value="MM Discards" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
<Class name="float" field="startTime" value="43.9500351" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="float" field="endTime" value="47.1987190" type="{EA2C3E90-AFBE-44D4-A90D-FAAF79BAF93D}"/>
<Class name="AZStd::optional" field="eventDatas" type="{C69F2148-2DBF-5C3D-8154-447C6E99D041}">
<Class name="AZStd::vector" field="element" type="{4918F644-6F31-5DC0-92CC-471D16563C8A}">
<Class name="AZStd::shared_ptr" field="element" type="{FA64741F-5600-53BC-9B56-A81DD4938FF5}">
<Class name="MotionMatchEventData" field="element" version="1" type="{25499823-E611-4958-85B7-476BC1918744}">
<Class name="EventData" field="BaseClass1" version="1" type="{F6AFCD3B-D58E-4821-9E7C-D1F437304E5D}"/>
<Class name="AZStd::string" field="tag" value="discard" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
<Class name="AZStd::string" field="metaData" value="" type="{03AAAB3F-5C47-5A66-9EBC-D5FA4DB353C9}"/>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</Class>
</ObjectStream>
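The ObjectStream above encodes each discard region as a motion event whose MotionMatchEventData carries the tag "discard". Conceptually, frames whose sample time falls inside such a range are excluded when the motion database is built. A minimal standalone sketch of that range test (names and structure are illustrative, not the gem's actual API):

```cpp
#include <cassert>
#include <vector>

// Illustrative stand-in for a discard event range (not the gem's actual types).
struct TimeRange
{
    float start;
    float end;
};

// Returns true when a sampled frame time falls inside any discard range,
// meaning the frame would be skipped during database import.
bool IsDiscarded(float sampleTime, const std::vector<TimeRange>& discardRanges)
{
    for (const TimeRange& range : discardRanges)
    {
        if (sampleTime >= range.start && sampleTime <= range.end)
        {
            return true;
        }
    }
    return false;
}
```

Using the two ranges serialized above (0.0 to 0.556917 and 43.9500351 to 47.198719), a frame sampled at 45.0 seconds would be rejected while one at 10.0 seconds would be kept.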

@@ -0,0 +1,41 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "walkstopturn1",
"selectedRootBone": "RootNode.root",
"id": "{D38F0C22-1841-5EBB-A198-9D9441CD7C80}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 63.53333282470703,
"endTime": 70.53333282470703
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,41 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "walkstopturnpivot1",
"selectedRootBone": "RootNode.root",
"id": "{FCD5DF16-A875-552E-9333-3C7BF8554CBB}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 53.06666564941406,
"endTime": 55.86666488647461
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}

@@ -0,0 +1,50 @@
{
"values": [
{
"$type": "MotionGroup",
"name": "walkturns1",
"selectedRootBone": "RootNode.root",
"id": "{FD1981AB-0270-56F7-9062-ABA4D73686F9}",
"rules": {
"rules": [
{
"$type": "EMotionFX::Pipeline::Rule::MotionMetaDataRule",
"data": {
"motionEventTable": {
"tracks": [
{
"name": "Sync",
"deletable": false,
"events": [
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 32.266666412353516,
"endTime": 51.33333206176758
},
{
"eventDatas": [
{
"$type": "DiscardFrameEventData"
}
],
"startTime": 86.86666870117188,
"endTime": 89.46666717529297
}
]
}
]
}
}
},
{
"$type": "MotionSamplingRule"
}
]
}
}
]
}
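The .assetinfo files above attach DiscardFrameEventData ranges to the Sync track. The practical effect is on database size: the discarded ranges are subtracted from the motion's duration before frames are sampled. A hedged sketch of that estimate (hypothetical helper, not part of this commit):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Illustrative discard range, mirroring startTime/endTime in the JSON above.
struct TimeRange
{
    float start;
    float end;
};

// Roughly estimates how many database frames remain for a motion after
// its discard ranges are removed, at a given feature sample rate (Hz).
size_t EstimateKeptFrames(float motionDuration, float sampleRate,
                          const std::vector<TimeRange>& discards)
{
    float discarded = 0.0f;
    for (const TimeRange& range : discards)
    {
        discarded += range.end - range.start;
    }
    const float kept = std::max(0.0f, motionDuration - discarded);
    return static_cast<size_t>(kept * sampleRate);
}
```

For walkturns1 above, the two ranges discard roughly 21.7 seconds; at the node's default 30 Hz sample rate that removes on the order of 650 frames from a 90-second clip.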

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d38ee57dcf86d1209982ffa29cf5a2b42f5e0e0b86f5045a8485e3e431d74b03
size 12267120


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:09cc2374b99c219421812ec39b68a350912d44777a50e3078c8cd67e91cc74cf
size 30927

@@ -0,0 +1,3 @@
[General]
version=1
startScript="ImportActor -filename \"Character/RinMM.actor\"\nCreateActorInstance -actorID %LASTRESULT% -xPos 4.585605 -yPos -7.166286 -zPos 0.000000 -xScale 1.000000 -yScale 1.000000 -zScale 1.000000 -rot 0.00000000,0.00000000,0.98711985,-0.15998250\nLoadMotionSet -filename \"@products@/MotionMatching.motionset\"\nLoadAnimGraph -filename \"@products@/MotionMatching.animgraph\"\nActivateAnimGraph -actorInstanceID %LASTRESULT3% -animGraphID %LASTRESULT1% -motionSetID %LASTRESULT2% -visualizeScale 1.000000\n"

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9c895a1696f5f37492089ceaef11210d704c1fb7e3dd31305cf4732d63489507
size 13645

@@ -0,0 +1,16 @@
#
# Copyright (c) Contributors to the Open 3D Engine Project.
# For complete copyright and license terms please see the LICENSE at the root of this distribution.
#
# SPDX-License-Identifier: Apache-2.0 OR MIT
#
#
set(o3de_gem_path ${CMAKE_CURRENT_LIST_DIR})
set(o3de_gem_json ${o3de_gem_path}/gem.json)
o3de_read_json_key(o3de_gem_name ${o3de_gem_json} "gem_name")
o3de_restricted_path(${o3de_gem_json} o3de_gem_restricted_path)
ly_add_external_target_path(${CMAKE_CURRENT_LIST_DIR}/3rdParty)
add_subdirectory(Code)

@@ -0,0 +1,155 @@
#
# Copyright (c) Contributors to the Open 3D Engine Project.
# For complete copyright and license terms please see the LICENSE at the root of this distribution.
#
# SPDX-License-Identifier: Apache-2.0 OR MIT
#
#
# Add the MotionMatching.Static target
ly_add_target(
NAME MotionMatching.Static STATIC
NAMESPACE Gem
FILES_CMAKE
motionmatching_files.cmake
INCLUDE_DIRECTORIES
PUBLIC
Include
PRIVATE
Source
BUILD_DEPENDENCIES
PUBLIC
AZ::AzCore
AZ::AzFramework
Gem::EMotionFXStaticLib
Gem::ImguiAtom.Static
)
# Here add MotionMatching target, it depends on the MotionMatching.Static
ly_add_target(
NAME MotionMatching ${PAL_TRAIT_MONOLITHIC_DRIVEN_MODULE_TYPE}
NAMESPACE Gem
FILES_CMAKE
motionmatching_shared_files.cmake
INCLUDE_DIRECTORIES
PUBLIC
Include
PRIVATE
Source
BUILD_DEPENDENCIES
PRIVATE
Gem::MotionMatching.Static
Gem::ImGui.Static
Gem::ImGui.ImGuiLYUtils
)
# By default, we will specify that the above target MotionMatching would be used by
# Client and Server type targets when this gem is enabled. If you don't want it
# active in Clients or Servers by default, delete one or both of the following lines:
ly_create_alias(NAME MotionMatching.Clients NAMESPACE Gem TARGETS Gem::MotionMatching)
ly_create_alias(NAME MotionMatching.Servers NAMESPACE Gem TARGETS Gem::MotionMatching)
# If we are on a host platform, we want to add the host tools targets like the MotionMatching.Editor target which
# will also depend on MotionMatching.Static
if(PAL_TRAIT_BUILD_HOST_TOOLS)
ly_add_target(
NAME MotionMatching.Editor.Static STATIC
NAMESPACE Gem
FILES_CMAKE
motionmatching_editor_files.cmake
INCLUDE_DIRECTORIES
PRIVATE
Source
PUBLIC
Include
BUILD_DEPENDENCIES
PUBLIC
AZ::AzToolsFramework
Gem::MotionMatching.Static
)
ly_add_target(
NAME MotionMatching.Editor GEM_MODULE
NAMESPACE Gem
AUTOMOC
OUTPUT_NAME Gem.MotionMatching.Editor
FILES_CMAKE
motionmatching_editor_shared_files.cmake
INCLUDE_DIRECTORIES
PRIVATE
Source
PUBLIC
Include
BUILD_DEPENDENCIES
PUBLIC
Gem::MotionMatching.Editor.Static
)
# By default, we will specify that the above target MotionMatching would be used by
# Tool and Builder type targets when this gem is enabled. If you don't want it
# active in Tools or Builders by default, delete one or both of the following lines:
ly_create_alias(NAME MotionMatching.Tools NAMESPACE Gem TARGETS Gem::MotionMatching.Editor)
ly_create_alias(NAME MotionMatching.Builders NAMESPACE Gem TARGETS Gem::MotionMatching.Editor)
endif()
################################################################################
# Tests
################################################################################
# See if globally, tests are supported
if(PAL_TRAIT_BUILD_TESTS_SUPPORTED)
# We globally support tests, see if we support tests on this platform for MotionMatching.Static
if(PAL_TRAIT_MOTIONMATCHING_TEST_SUPPORTED)
# We support MotionMatching.Tests on this platform, add MotionMatching.Tests target which depends on MotionMatching.Static
ly_add_target(
NAME MotionMatching.Tests ${PAL_TRAIT_TEST_TARGET_TYPE}
NAMESPACE Gem
FILES_CMAKE
motionmatching_files.cmake
motionmatching_tests_files.cmake
INCLUDE_DIRECTORIES
PRIVATE
Tests
Source
BUILD_DEPENDENCIES
PRIVATE
AZ::AzTest
AZ::AzFramework
Gem::EMotionFX.Tests.Static
Gem::MotionMatching.Static
)
# Add MotionMatching.Tests to googletest
ly_add_googletest(
NAME Gem::MotionMatching.Tests
)
endif()
# If we are a host platform we want to add tools test like editor tests here
if(PAL_TRAIT_BUILD_HOST_TOOLS)
# We are a host platform, see if Editor tests are supported on this platform
if(PAL_TRAIT_MOTIONMATCHING_EDITOR_TEST_SUPPORTED)
# We support MotionMatching.Editor.Tests on this platform, add MotionMatching.Editor.Tests target which depends on MotionMatching.Editor
ly_add_target(
NAME MotionMatching.Editor.Tests ${PAL_TRAIT_TEST_TARGET_TYPE}
NAMESPACE Gem
FILES_CMAKE
motionmatching_editor_tests_files.cmake
INCLUDE_DIRECTORIES
PRIVATE
Tests
Source
BUILD_DEPENDENCIES
PRIVATE
AZ::AzTest
Gem::MotionMatching.Editor
)
# Add MotionMatching.Editor.Tests to googletest
ly_add_googletest(
NAME Gem::MotionMatching.Editor.Tests
)
endif()
endif()
endif()

@@ -0,0 +1,38 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/EBus/EBus.h>
#include <AzCore/Interface/Interface.h>
namespace EMotionFX::MotionMatching
{
class MotionMatchingRequests
{
public:
AZ_RTTI(MotionMatchingRequests, "{b08f73cc-a922-49ef-8c0e-07166b43ea65}");
virtual ~MotionMatchingRequests() = default;
// Put your public methods here
};
class MotionMatchingBusTraits
: public AZ::EBusTraits
{
public:
//////////////////////////////////////////////////////////////////////////
// EBusTraits overrides
static constexpr AZ::EBusHandlerPolicy HandlerPolicy = AZ::EBusHandlerPolicy::Single;
static constexpr AZ::EBusAddressPolicy AddressPolicy = AZ::EBusAddressPolicy::Single;
//////////////////////////////////////////////////////////////////////////
};
using MotionMatchingRequestBus = AZ::EBus<MotionMatchingRequests, MotionMatchingBusTraits>;
using MotionMatchingInterface = AZ::Interface<MotionMatchingRequests>;
} // namespace EMotionFX::MotionMatching
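MotionMatchingRequestBus and MotionMatchingInterface both use single-address, single-handler policies, so at most one system implements the request API at a time. A self-contained analogue of that single-instance interface pattern (the `Interface` template below is an illustrative stand-in, not AzCore's `AZ::Interface`):

```cpp
#include <cassert>
#include <string>

// Illustrative stand-in for AZ::Interface<T>: one global provider per interface type.
template <typename T>
class Interface
{
public:
    static void Register(T* provider) { s_instance = provider; }
    static void Unregister(T* provider)
    {
        if (s_instance == provider)
        {
            s_instance = nullptr;
        }
    }
    static T* Get() { return s_instance; }

private:
    static inline T* s_instance = nullptr;
};

// Mirrors the shape of MotionMatchingRequests: a virtual request API
// with a single global implementer. GetStatus() is a hypothetical method.
class MotionMatchingRequests
{
public:
    virtual ~MotionMatchingRequests() = default;
    virtual std::string GetStatus() const = 0;
};

// The system component registers itself on construction and unregisters
// on destruction, as O3DE system components typically do.
class MotionMatchingSystem final : public MotionMatchingRequests
{
public:
    MotionMatchingSystem() { Interface<MotionMatchingRequests>::Register(this); }
    ~MotionMatchingSystem() override { Interface<MotionMatchingRequests>::Unregister(this); }
    std::string GetStatus() const override { return "ready"; }
};
```

Callers would then reach the one registered handler via `Interface<MotionMatchingRequests>::Get()`, which is the access pattern the `MotionMatchingInterface` alias above enables.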

@@ -0,0 +1,16 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Memory/SystemAllocator.h>
namespace EMotionFX::MotionMatching
{
using MotionMatchAllocator = AZ::SystemAllocator;
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,373 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
#include <AzCore/std/smart_ptr/make_shared.h>
#include <EMotionFX/Source/AnimGraph.h>
#include <EMotionFX/Source/AnimGraphManager.h>
#include <EMotionFX/Source/EventManager.h>
#include <EMotionFX/Source/Motion.h>
#include <BlendTreeMotionMatchNode.h>
#include <FeatureSchemaDefault.h>
#include <EMotionFX/Source/MotionSet.h>
#include <EMotionFX/Source/Node.h>
#include <EMotionFX/Source/Recorder.h>
#include <EMotionFX/Source/TransformData.h>
#include <FeaturePosition.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(BlendTreeMotionMatchNode, AnimGraphAllocator, 0)
AZ_CLASS_ALLOCATOR_IMPL(BlendTreeMotionMatchNode::UniqueData, AnimGraphObjectUniqueDataAllocator, 0)
BlendTreeMotionMatchNode::BlendTreeMotionMatchNode()
: AnimGraphNode()
{
// Setup the input ports.
InitInputPorts(2);
SetupInputPort("Goal Pos", INPUTPORT_TARGETPOS, MCore::AttributeVector3::TYPE_ID, PORTID_INPUT_TARGETPOS);
SetupInputPort("Goal Facing Dir", INPUTPORT_TARGETFACINGDIR, MCore::AttributeVector3::TYPE_ID, PORTID_INPUT_TARGETFACINGDIR);
// Setup the output ports.
InitOutputPorts(1);
SetupOutputPortAsPose("Output Pose", OUTPUTPORT_POSE, PORTID_OUTPUT_POSE);
}
BlendTreeMotionMatchNode::~BlendTreeMotionMatchNode()
{
}
bool BlendTreeMotionMatchNode::InitAfterLoading(AnimGraph* animGraph)
{
if (!AnimGraphNode::InitAfterLoading(animGraph))
{
return false;
}
// Automatically register the default feature schema in case the schema is empty after loading the node.
if (m_featureSchema.GetNumFeatures() == 0)
{
AZStd::string rootJointName;
if (m_animGraph->GetNumAnimGraphInstances() > 0)
{
const Actor* actor = m_animGraph->GetAnimGraphInstance(0)->GetActorInstance()->GetActor();
const Node* rootJoint = actor->GetMotionExtractionNode();
if (rootJoint)
{
rootJointName = rootJoint->GetNameString();
}
}
DefaultFeatureSchemaInitSettings defaultSettings;
defaultSettings.m_rootJointName = rootJointName.c_str();
defaultSettings.m_leftFootJointName = "L_foot_JNT";
defaultSettings.m_rightFootJointName = "R_foot_JNT";
defaultSettings.m_pelvisJointName = "C_pelvis_JNT";
DefaultFeatureSchema(m_featureSchema, defaultSettings);
}
InitInternalAttributesForAllInstances();
Reinit();
return true;
}
const char* BlendTreeMotionMatchNode::GetPaletteName() const
{
return "Motion Matching";
}
AnimGraphObject::ECategory BlendTreeMotionMatchNode::GetPaletteCategory() const
{
return AnimGraphObject::CATEGORY_SOURCES;
}
void BlendTreeMotionMatchNode::UniqueData::Update()
{
AZ_PROFILE_SCOPE(Animation, "BlendTreeMotionMatchNode::UniqueData::Update");
auto animGraphNode = azdynamic_cast<BlendTreeMotionMatchNode*>(m_object);
AZ_Assert(animGraphNode, "Unique data linked to incorrect node type.");
ActorInstance* actorInstance = m_animGraphInstance->GetActorInstance();
// Clear existing data.
delete m_instance;
delete m_data;
m_data = aznew MotionMatching::MotionMatchingData(animGraphNode->m_featureSchema);
m_instance = aznew MotionMatching::MotionMatchingInstance();
MotionSet* motionSet = m_animGraphInstance->GetMotionSet();
if (!motionSet)
{
SetHasError(true);
return;
}
//---------------------------------
AZ::Debug::Timer timer;
timer.Stamp();
// Build a list of motions we want to import the frames from.
AZ_Printf("Motion Matching", "Importing motion database...");
MotionMatching::MotionMatchingData::InitSettings settings;
settings.m_actorInstance = actorInstance;
settings.m_frameImportSettings.m_sampleRate = animGraphNode->m_sampleRate;
settings.m_importMirrored = animGraphNode->m_mirror;
settings.m_maxKdTreeDepth = animGraphNode->m_maxKdTreeDepth;
settings.m_minFramesPerKdTreeNode = animGraphNode->m_minFramesPerKdTreeNode;
settings.m_motionList.reserve(animGraphNode->m_motionIds.size());
for (const AZStd::string& id : animGraphNode->m_motionIds)
{
Motion* motion = motionSet->RecursiveFindMotionById(id);
if (motion)
{
settings.m_motionList.emplace_back(motion);
}
else
{
AZ_Warning("Motion Matching", false, "Failed to get motion for motionset entry id '%s'", id.c_str());
}
}
// Initialize the motion matching data (slow).
AZ_Printf("Motion Matching", "Initializing motion matching...");
if (!m_data->Init(settings))
{
AZ_Warning("Motion Matching", false, "Failed to initialize motion matching for anim graph node '%s'!", animGraphNode->GetName());
SetHasError(true);
return;
}
// Initialize the instance.
AZ_Printf("Motion Matching", "Initializing instance...");
MotionMatching::MotionMatchingInstance::InitSettings initSettings;
initSettings.m_actorInstance = actorInstance;
initSettings.m_data = m_data;
m_instance->Init(initSettings);
const float initTime = timer.GetDeltaTimeInSeconds();
const size_t memUsage = m_data->GetFrameDatabase().CalcMemoryUsageInBytes();
AZ_Printf("Motion Matching", "Finished in %.2f seconds (mem usage=%zu bytes or %.2f MB)", initTime, memUsage, memUsage / (float)(1024 * 1024));
//---------------------------------
SetHasError(false);
}
void BlendTreeMotionMatchNode::Update(AnimGraphInstance* animGraphInstance, float timePassedInSeconds)
{
AZ_PROFILE_SCOPE(Animation, "BlendTreeMotionMatchNode::Update");
m_timer.Stamp();
UniqueData* uniqueData = static_cast<UniqueData*>(FindOrCreateUniqueNodeData(animGraphInstance));
UpdateAllIncomingNodes(animGraphInstance, timePassedInSeconds);
uniqueData->Clear();
if (uniqueData->GetHasError())
{
m_updateTimeInMs = 0.0f;
m_postUpdateTimeInMs = 0.0f;
m_outputTimeInMs = 0.0f;
return;
}
AZ::Vector3 targetPos = AZ::Vector3::CreateZero();
TryGetInputVector3(animGraphInstance, INPUTPORT_TARGETPOS, targetPos);
AZ::Vector3 targetFacingDir = AZ::Vector3::CreateAxisY();
TryGetInputVector3(animGraphInstance, INPUTPORT_TARGETFACINGDIR, targetFacingDir);
MotionMatching::MotionMatchingInstance* instance = uniqueData->m_instance;
instance->Update(timePassedInSeconds, targetPos, targetFacingDir, m_trajectoryQueryMode, m_pathRadius, m_pathSpeed);
// set the current time to the new calculated time
uniqueData->ClearInheritFlags();
uniqueData->SetPreSyncTime(instance->GetMotionInstance()->GetCurrentTime());
uniqueData->SetCurrentPlayTime(instance->GetNewMotionTime());
if (uniqueData->GetPreSyncTime() > uniqueData->GetCurrentPlayTime())
{
uniqueData->SetPreSyncTime(uniqueData->GetCurrentPlayTime());
}
m_updateTimeInMs = m_timer.GetDeltaTimeInSeconds() * 1000.0f;
}
void BlendTreeMotionMatchNode::PostUpdate(AnimGraphInstance* animGraphInstance, float timePassedInSeconds)
{
AZ_PROFILE_SCOPE(Animation, "BlendTreeMotionMatchNode::PostUpdate");
m_timer.Stamp();
for (AZ::u32 i = 0; i < GetNumConnections(); ++i)
{
AnimGraphNode* node = GetConnection(i)->GetSourceNode();
node->PerformPostUpdate(animGraphInstance, timePassedInSeconds);
}
UniqueData* uniqueData = static_cast<UniqueData*>(FindOrCreateUniqueNodeData(animGraphInstance));
MotionMatching::MotionMatchingInstance* instance = uniqueData->m_instance;
RequestRefDatas(animGraphInstance);
AnimGraphRefCountedData* data = uniqueData->GetRefCountedData();
data->ClearEventBuffer();
data->ZeroTrajectoryDelta();
if (uniqueData->GetHasError())
{
return;
}
MotionInstance* motionInstance = instance->GetMotionInstance();
motionInstance->UpdateByTimeValues(uniqueData->GetPreSyncTime(), uniqueData->GetCurrentPlayTime(), &data->GetEventBuffer());
uniqueData->SetCurrentPlayTime(motionInstance->GetCurrentTime());
data->GetEventBuffer().UpdateEmitters(this);
instance->PostUpdate(timePassedInSeconds);
const Transform& trajectoryDelta = instance->GetMotionExtractionDelta();
data->SetTrajectoryDelta(trajectoryDelta);
data->SetTrajectoryDeltaMirrored(trajectoryDelta); // TODO: use a real mirrored version here.
m_postUpdateTimeInMs = m_timer.GetDeltaTimeInSeconds() * 1000.0f;
}
void BlendTreeMotionMatchNode::Output(AnimGraphInstance* animGraphInstance)
{
AZ_PROFILE_SCOPE(Animation, "BlendTreeMotionMatchNode::Output");
m_timer.Stamp();
AnimGraphPose* outputPose;
// Initialize to bind pose.
ActorInstance* actorInstance = animGraphInstance->GetActorInstance();
RequestPoses(animGraphInstance);
outputPose = GetOutputPose(animGraphInstance, OUTPUTPORT_POSE)->GetValue();
outputPose->InitFromBindPose(actorInstance);
if (m_disabled)
{
return;
}
UniqueData* uniqueData = static_cast<UniqueData*>(FindOrCreateUniqueNodeData(animGraphInstance));
if (GetEMotionFX().GetIsInEditorMode())
{
SetHasError(uniqueData, uniqueData->GetHasError());
}
if (uniqueData->GetHasError())
{
return;
}
OutputIncomingNode(animGraphInstance, GetInputNode(INPUTPORT_TARGETPOS));
OutputIncomingNode(animGraphInstance, GetInputNode(INPUTPORT_TARGETFACINGDIR));
MotionMatching::MotionMatchingInstance* instance = uniqueData->m_instance;
instance->SetLowestCostSearchFrequency(m_lowestCostSearchFrequency);
Pose& outTransformPose = outputPose->GetPose();
instance->Output(outTransformPose);
// Performance metrics
m_outputTimeInMs = m_timer.GetDeltaTimeInSeconds() * 1000.0f;
{
//AZ_Printf("MotionMatch", "Update = %.2f, PostUpdate = %.2f, Output = %.2f", m_updateTime, m_postUpdateTime, m_outputTime);
#ifdef IMGUI_ENABLED
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushPerformanceHistogramValue, "Update", m_updateTimeInMs);
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushPerformanceHistogramValue, "Post Update", m_postUpdateTimeInMs);
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushPerformanceHistogramValue, "Output", m_outputTimeInMs);
#endif
}
instance->DebugDraw();
}
void BlendTreeMotionMatchNode::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<BlendTreeMotionMatchNode, AnimGraphNode>()
->Version(9)
->Field("sampleRate", &BlendTreeMotionMatchNode::m_sampleRate)
->Field("lowestCostSearchFrequency", &BlendTreeMotionMatchNode::m_lowestCostSearchFrequency)
->Field("maxKdTreeDepth", &BlendTreeMotionMatchNode::m_maxKdTreeDepth)
->Field("minFramesPerKdTreeNode", &BlendTreeMotionMatchNode::m_minFramesPerKdTreeNode)
->Field("mirror", &BlendTreeMotionMatchNode::m_mirror)
->Field("controlSplineMode", &BlendTreeMotionMatchNode::m_trajectoryQueryMode)
->Field("pathRadius", &BlendTreeMotionMatchNode::m_pathRadius)
->Field("pathSpeed", &BlendTreeMotionMatchNode::m_pathSpeed)
->Field("featureSchema", &BlendTreeMotionMatchNode::m_featureSchema)
->Field("motionIds", &BlendTreeMotionMatchNode::m_motionIds)
;
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<BlendTreeMotionMatchNode>("Motion Matching Node", "Motion Matching Attributes")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AutoExpand, "")
->Attribute(AZ::Edit::Attributes::Visibility, AZ::Edit::PropertyVisibility::ShowChildrenOnly)
->DataElement(AZ::Edit::UIHandlers::Default, &BlendTreeMotionMatchNode::m_sampleRate, "Feature sample rate", "The sample rate (in Hz) used for extracting the features from the animations. The higher the sample rate, the more data will be used and the more options the motion matching search has available for the best matching frame.")
->Attribute(AZ::Edit::Attributes::Min, 1)
->Attribute(AZ::Edit::Attributes::Max, 240)
->Attribute(AZ::Edit::Attributes::ChangeNotify, &BlendTreeMotionMatchNode::Reinit)
->DataElement(AZ::Edit::UIHandlers::Default, &BlendTreeMotionMatchNode::m_lowestCostSearchFrequency, "Search frequency", "How often per second we apply the motion matching search and find the lowest cost / best matching frame, and start to blend towards it.")
->Attribute(AZ::Edit::Attributes::Min, 0.001f)
->Attribute(AZ::Edit::Attributes::Max, std::numeric_limits<float>::max())
->Attribute(AZ::Edit::Attributes::Step, 0.05f)
->DataElement(AZ::Edit::UIHandlers::Default, &BlendTreeMotionMatchNode::m_maxKdTreeDepth, "Max kdTree depth", "The maximum number of hierarchy levels in the kdTree.")
->Attribute(AZ::Edit::Attributes::Min, 1)
->Attribute(AZ::Edit::Attributes::Max, 20)
->Attribute(AZ::Edit::Attributes::ChangeNotify, &BlendTreeMotionMatchNode::Reinit)
->DataElement(AZ::Edit::UIHandlers::Default, &BlendTreeMotionMatchNode::m_minFramesPerKdTreeNode, "Min kdTree node size", "The minimum number of frames to store per kdTree node.")
->Attribute(AZ::Edit::Attributes::Min, 1)
->Attribute(AZ::Edit::Attributes::Max, 100000)
->Attribute(AZ::Edit::Attributes::ChangeNotify, &BlendTreeMotionMatchNode::Reinit)
->DataElement(AZ::Edit::UIHandlers::Default, &BlendTreeMotionMatchNode::m_pathRadius, "Path radius", "")
->Attribute(AZ::Edit::Attributes::Min, 0.0001f)
->Attribute(AZ::Edit::Attributes::Max, std::numeric_limits<float>::max())
->Attribute(AZ::Edit::Attributes::Step, 0.01f)
->DataElement(AZ::Edit::UIHandlers::Default, &BlendTreeMotionMatchNode::m_pathSpeed, "Path speed", "")
->Attribute(AZ::Edit::Attributes::Min, 0.0001f)
->Attribute(AZ::Edit::Attributes::Max, std::numeric_limits<float>::max())
->Attribute(AZ::Edit::Attributes::Step, 0.01f)
->DataElement(AZ::Edit::UIHandlers::ComboBox, &BlendTreeMotionMatchNode::m_trajectoryQueryMode, "Trajectory mode", "Desired future trajectory generation mode.")
->EnumAttribute(TrajectoryQuery::MODE_TARGETDRIVEN, "Target driven")
->EnumAttribute(TrajectoryQuery::MODE_ONE, "Mode one")
->EnumAttribute(TrajectoryQuery::MODE_TWO, "Mode two")
->EnumAttribute(TrajectoryQuery::MODE_THREE, "Mode three")
->EnumAttribute(TrajectoryQuery::MODE_FOUR, "Mode four")
->DataElement(AZ::Edit::UIHandlers::Default, &BlendTreeMotionMatchNode::m_featureSchema, "FeatureSchema", "")
->Attribute(AZ::Edit::Attributes::AutoExpand, "")
->Attribute(AZ::Edit::Attributes::Visibility, AZ::Edit::PropertyVisibility::ShowChildrenOnly)
->Attribute(AZ::Edit::Attributes::ChangeNotify, &BlendTreeMotionMatchNode::Reinit)
->DataElement(AZ_CRC("MotionSetMotionIds", 0x8695c0fa), &BlendTreeMotionMatchNode::m_motionIds, "Motions", "")
->Attribute(AZ::Edit::Attributes::ChangeNotify, &BlendTreeMotionMatchNode::Reinit)
->Attribute(AZ::Edit::Attributes::ContainerCanBeModified, false)
->Attribute(AZ::Edit::Attributes::Visibility, AZ::Edit::PropertyVisibility::HideChildren)
;
}
} // namespace EMotionFX::MotionMatching
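The `m_lowestCostSearchFrequency` field reflected above controls how many times per second the lowest-cost search runs, decoupled from the render frame rate. A common way to implement such gating is a time accumulator; the sketch below is an assumption about the approach, not the gem's actual scheduling code:

```cpp
#include <cassert>

// Hypothetical sketch: fire the motion matching search at a fixed frequency
// (searches per second), regardless of how often Update() is called.
class SearchScheduler
{
public:
    explicit SearchScheduler(float searchesPerSecond)
        : m_interval(1.0f / searchesPerSecond)
    {
    }

    // Accumulates elapsed time; returns true when enough time has passed
    // for a new lowest-cost search, then carries over the remainder.
    bool Tick(float deltaSeconds)
    {
        m_accumulated += deltaSeconds;
        if (m_accumulated >= m_interval)
        {
            m_accumulated -= m_interval;
            return true;
        }
        return false;
    }

private:
    float m_interval = 0.0f;
    float m_accumulated = 0.0f;
};
```

At the node's default of 5 searches per second, an update loop running at 60 FPS would trigger roughly 5 searches over one simulated second and blend toward the winning frame between searches.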

@@ -0,0 +1,111 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Debug/Timer.h>
#include <EMotionFX/Source/AnimGraphNode.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <MotionMatchingInstance.h>
#include <FeatureSchema.h>
#include <MotionMatchingData.h>
#include <ImGuiMonitor.h>
namespace EMotionFX::MotionMatching
{
class EMFX_API BlendTreeMotionMatchNode
: public AnimGraphNode
{
public:
AZ_RTTI(BlendTreeMotionMatchNode, "{1DC80DCD-6536-4950-9260-A4615C03E3C5}", AnimGraphNode)
AZ_CLASS_ALLOCATOR_DECL
enum
{
INPUTPORT_TARGETPOS = 0,
INPUTPORT_TARGETFACINGDIR = 1,
OUTPUTPORT_POSE = 0
};
enum
{
PORTID_INPUT_TARGETPOS = 0,
PORTID_INPUT_TARGETFACINGDIR = 1,
PORTID_OUTPUT_POSE = 0
};
class EMFX_API UniqueData
: public AnimGraphNodeData
{
EMFX_ANIMGRAPHOBJECTDATA_IMPLEMENT_LOADSAVE
public:
AZ_CLASS_ALLOCATOR_DECL
UniqueData(AnimGraphNode* node, AnimGraphInstance* animGraphInstance)
: AnimGraphNodeData(node, animGraphInstance)
{
}
~UniqueData()
{
delete m_data;
delete m_instance;
}
void Update() override;
public:
MotionMatching::MotionMatchingInstance* m_instance = nullptr;
MotionMatching::MotionMatchingData* m_data = nullptr;
};
BlendTreeMotionMatchNode();
~BlendTreeMotionMatchNode();
bool InitAfterLoading(AnimGraph* animGraph) override;
bool GetSupportsVisualization() const override { return true; }
bool GetHasOutputPose() const override { return true; }
bool GetSupportsDisable() const override { return true; }
AZ::Color GetVisualColor() const override { return AZ::Colors::Green; }
AnimGraphPose* GetMainOutputPose(AnimGraphInstance* animGraphInstance) const override { return GetOutputPose(animGraphInstance, OUTPUTPORT_POSE)->GetValue(); }
const char* GetPaletteName() const override;
AnimGraphObject::ECategory GetPaletteCategory() const override;
AnimGraphObjectData* CreateUniqueData(AnimGraphInstance* animGraphInstance) override { return aznew UniqueData(this, animGraphInstance); }
static void Reflect(AZ::ReflectContext* context);
private:
void Output(AnimGraphInstance* animGraphInstance) override;
void Update(AnimGraphInstance* animGraphInstance, float timePassedInSeconds) override;
void PostUpdate(AnimGraphInstance* animGraphInstance, float timePassedInSeconds) override;
FeatureSchema m_featureSchema;
AZStd::vector<AZStd::string> m_motionIds;
float m_pathRadius = 1.0f;
float m_pathSpeed = 1.0f;
float m_lowestCostSearchFrequency = 5.0f;
AZ::u32 m_sampleRate = 30;
AZ::u32 m_maxKdTreeDepth = 15;
AZ::u32 m_minFramesPerKdTreeNode = 1000;
TrajectoryQuery::EMode m_trajectoryQueryMode = TrajectoryQuery::MODE_TARGETDRIVEN;
bool m_mirror = false;
AZ::Debug::Timer m_timer;
float m_updateTimeInMs = 0.0f;
float m_postUpdateTimeInMs = 0.0f;
float m_outputTimeInMs = 0.0f;
#ifdef IMGUI_ENABLED
ImGuiMonitor m_imguiMonitor;
#endif
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,92 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Serialization/SerializeContext.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/StringFunc/StringFunc.h>
#include <EventData.h>
#include <Allocators.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(DiscardFrameEventData, MotionEventAllocator, 0)
bool DiscardFrameEventData::Equal([[maybe_unused]]const EventData& rhs, [[maybe_unused]] bool ignoreEmptyFields) const
{
return true;
}
void DiscardFrameEventData::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<DiscardFrameEventData, EventData>()
->Version(1)
;
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<DiscardFrameEventData>("[Motion Matching] Discard Frame", "Event used for discarding ranges of the animation.")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AutoExpand, true)
->Attribute(AZ::Edit::Attributes::Visibility, AZ::Edit::PropertyVisibility::ShowChildrenOnly)
->Attribute(AZ_CRC_CE("Creatable"), true)
;
}
///////////////////////////////////////////////////////////////////////////
AZ_CLASS_ALLOCATOR_IMPL(TagEventData, MotionEventAllocator, 0)
bool TagEventData::Equal(const EventData& rhs, [[maybe_unused]] bool ignoreEmptyFields) const
{
const TagEventData* other = azdynamic_cast<const TagEventData*>(&rhs);
if (other)
{
return AZ::StringFunc::Equal(m_tag.c_str(), other->m_tag.c_str(), /*caseSensitive=*/false);
}
return false;
}
void TagEventData::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<TagEventData, EventData>()
->Version(1)
->Field("tag", &TagEventData::m_tag)
;
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<TagEventData>("[Motion Matching] Tag", "")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AutoExpand, true)
->Attribute(AZ::Edit::Attributes::Visibility, AZ::Edit::PropertyVisibility::ShowChildrenOnly)
->Attribute(AZ_CRC_CE("Creatable"), true)
->DataElement(AZ::Edit::UIHandlers::Default, &TagEventData::m_tag, "Tag", "The tag that should be active.")
;
}
} // namespace EMotionFX::MotionMatching
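`TagEventData::Equal` above matches tags case-insensitively via `AZ::StringFunc::Equal` with `caseSensitive=false`. A standalone equivalent of that comparison, using only the standard library (illustrative, not the AzCore implementation):

```cpp
#include <cassert>
#include <cctype>
#include <string>

// Case-insensitive tag comparison, mirroring the behavior TagEventData::Equal
// relies on: "Discard" and "discard" refer to the same tag.
bool TagsEqual(const std::string& lhs, const std::string& rhs)
{
    if (lhs.size() != rhs.size())
    {
        return false;
    }
    for (size_t i = 0; i < lhs.size(); ++i)
    {
        if (std::tolower(static_cast<unsigned char>(lhs[i])) !=
            std::tolower(static_cast<unsigned char>(rhs[i])))
        {
            return false;
        }
    }
    return true;
}
```

Case-insensitive equality keeps event matching forgiving of authoring differences, since tag names are typed by hand in the motion event editor.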

@@ -0,0 +1,55 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <EMotionFX/Source/EventData.h>
namespace AZ
{
class ReflectContext;
}
namespace EMotionFX::MotionMatching
{
class EMFX_API DiscardFrameEventData
: public EventData
{
public:
AZ_RTTI(DiscardFrameEventData, "{25499823-E611-4958-85B7-476BC1918744}", EventData);
AZ_CLASS_ALLOCATOR_DECL
DiscardFrameEventData() = default;
~DiscardFrameEventData() override = default;
static void Reflect(AZ::ReflectContext* context);
bool Equal(const EventData& rhs, bool ignoreEmptyFields = false) const override;
private:
AZStd::string m_tag;
};
class EMFX_API TagEventData
: public EventData
{
public:
AZ_RTTI(TagEventData, "{FEFEA2C7-CD68-43B2-94D6-85559E29EABF}", EventData);
AZ_CLASS_ALLOCATOR_DECL
TagEventData() = default;
~TagEventData() override = default;
static void Reflect(AZ::ReflectContext* context);
bool Equal(const EventData& rhs, bool ignoreEmptyFields = false) const override;
private:
AZStd::string m_tag;
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,275 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <EMotionFX/Source/ActorInstance.h>
#include <Allocators.h>
#include <EMotionFX/Source/AnimGraphPose.h>
#include <EMotionFX/Source/AnimGraphPosePool.h>
#include <EMotionFX/Source/EMotionFXManager.h>
#include <EMotionFX/Source/MotionInstance.h>
#include <Feature.h>
#include <Frame.h>
#include <EMotionFX/Source/Pose.h>
#include <EMotionFX/Source/TransformData.h>
#include <EMotionFX/Source/Velocity.h>
#include <MCore/Source/AzCoreConversions.h>
#include <MCore/Source/Color.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
#include <AzCore/StringFunc/StringFunc.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(Feature, MotionMatchAllocator, 0)
bool Feature::Init(const InitSettings& settings)
{
const Actor* actor = settings.m_actorInstance->GetActor();
const Skeleton* skeleton = actor->GetSkeleton();
const Node* joint = skeleton->FindNodeByNameNoCase(m_jointName.c_str());
m_jointIndex = joint ? joint->GetNodeIndex() : InvalidIndex;
if (m_jointIndex == InvalidIndex)
{
AZ_Error("MotionMatching", false, "Feature::Init(): Cannot find index for joint named '%s'.", m_jointName.c_str());
return false;
}
const Node* relativeToJoint = skeleton->FindNodeByNameNoCase(m_relativeToJointName.c_str());
m_relativeToNodeIndex = relativeToJoint ? relativeToJoint->GetNodeIndex() : InvalidIndex;
if (m_relativeToNodeIndex == InvalidIndex)
{
AZ_Error("MotionMatching", false, "Feature::Init(): Cannot find index for joint named '%s'.", m_relativeToJointName.c_str());
return false;
}
// Set a default feature name in case it did not get set manually.
if (m_name.empty())
{
AZStd::string featureTypeName = this->RTTI_GetTypeName();
AzFramework::StringFunc::Replace(featureTypeName, "Feature", "");
m_name = AZStd::string::format("%s (%s)", featureTypeName.c_str(), m_jointName.c_str());
}
return true;
}
void Feature::SetDebugDrawColor(const AZ::Color& color)
{
m_debugColor = color;
}
const AZ::Color& Feature::GetDebugDrawColor() const
{
return m_debugColor;
}
void Feature::SetDebugDrawEnabled(bool enabled)
{
m_debugDrawEnabled = enabled;
}
bool Feature::GetDebugDrawEnabled() const
{
return m_debugDrawEnabled;
}
float Feature::CalculateFrameCost([[maybe_unused]] size_t frameIndex, [[maybe_unused]] const FrameCostContext& context) const
{
AZ_Assert(false, "Feature::CalculateFrameCost(): Not implemented for the given feature.");
return 0.0f;
}
void Feature::SetRelativeToNodeIndex(size_t nodeIndex)
{
m_relativeToNodeIndex = nodeIndex;
}
void Feature::CalculateVelocity(size_t jointIndex, size_t relativeToJointIndex, MotionInstance* motionInstance, AZ::Vector3& outVelocity)
{
const float originalTime = motionInstance->GetCurrentTime();
// Prepare for sampling.
ActorInstance* actorInstance = motionInstance->GetActorInstance();
AnimGraphPosePool& posePool = GetEMotionFX().GetThreadData(actorInstance->GetThreadIndex())->GetPosePool();
AnimGraphPose* prevPose = posePool.RequestPose(actorInstance);
AnimGraphPose* currentPose = posePool.RequestPose(actorInstance);
Pose* bindPose = actorInstance->GetTransformData()->GetBindPose();
const size_t numSamples = 3;
const float timeRange = 0.05f; // secs
const float halfTimeRange = timeRange * 0.5f;
const float startTime = originalTime - halfTimeRange;
const float frameDelta = timeRange / numSamples;
AZ::Vector3 accumulatedVelocity = AZ::Vector3::CreateZero();
for (size_t sampleIndex = 0; sampleIndex < numSamples + 1; ++sampleIndex)
{
float sampleTime = startTime + sampleIndex * frameDelta;
if (sampleTime < 0.0f)
{
sampleTime = 0.0f;
}
if (sampleTime >= motionInstance->GetMotion()->GetDuration())
{
sampleTime = motionInstance->GetMotion()->GetDuration();
}
if (sampleIndex == 0)
{
motionInstance->SetCurrentTime(sampleTime);
motionInstance->GetMotion()->Update(bindPose, &prevPose->GetPose(), motionInstance);
continue;
}
motionInstance->SetCurrentTime(sampleTime);
motionInstance->GetMotion()->Update(bindPose, &currentPose->GetPose(), motionInstance);
const Transform inverseJointWorldTransform = currentPose->GetPose().GetWorldSpaceTransform(relativeToJointIndex).Inversed();
// Calculate the velocity.
const AZ::Vector3 prevPosition = prevPose->GetPose().GetWorldSpaceTransform(jointIndex).m_position;
const AZ::Vector3 currentPosition = currentPose->GetPose().GetWorldSpaceTransform(jointIndex).m_position;
const AZ::Vector3 velocity = CalculateLinearVelocity(prevPosition, currentPosition, frameDelta);
accumulatedVelocity += inverseJointWorldTransform.TransformVector(velocity);
*prevPose = *currentPose;
}
outVelocity = accumulatedVelocity / aznumeric_cast<float>(numSamples);
motionInstance->SetCurrentTime(originalTime); // set back to what it was
posePool.FreePose(prevPose);
posePool.FreePose(currentPose);
}
void Feature::CalculateVelocity(const ActorInstance* actorInstance, size_t jointIndex, size_t relativeToJointIndex, const Frame& frame, AZ::Vector3& outVelocity)
{
AnimGraphPosePool& posePool = GetEMotionFX().GetThreadData(actorInstance->GetThreadIndex())->GetPosePool();
AnimGraphPose* prevPose = posePool.RequestPose(actorInstance);
AnimGraphPose* currentPose = posePool.RequestPose(actorInstance);
const size_t numSamples = 3;
const float timeRange = 0.05f; // secs
const float halfTimeRange = timeRange * 0.5f;
const float frameDelta = timeRange / numSamples;
AZ::Vector3 accumulatedVelocity = AZ::Vector3::CreateZero();
for (size_t sampleIndex = 0; sampleIndex < numSamples + 1; ++sampleIndex)
{
const float sampleTimeOffset = (-halfTimeRange) + sampleIndex * frameDelta;
if (sampleIndex == 0)
{
frame.SamplePose(&prevPose->GetPose(), sampleTimeOffset);
continue;
}
frame.SamplePose(&currentPose->GetPose(), sampleTimeOffset);
const Transform inverseJointWorldTransform = currentPose->GetPose().GetWorldSpaceTransform(relativeToJointIndex).Inversed();
// Calculate the velocity.
const AZ::Vector3 prevPosition = prevPose->GetPose().GetWorldSpaceTransform(jointIndex).m_position;
const AZ::Vector3 currentPosition = currentPose->GetPose().GetWorldSpaceTransform(jointIndex).m_position;
const AZ::Vector3 velocity = CalculateLinearVelocity(prevPosition, currentPosition, frameDelta);
accumulatedVelocity += inverseJointWorldTransform.TransformVector(velocity);
*prevPose = *currentPose;
}
outVelocity = accumulatedVelocity / aznumeric_cast<float>(numSamples);
posePool.FreePose(prevPose);
posePool.FreePose(currentPose);
}
float Feature::GetNormalizedDirectionDifference(const AZ::Vector2& directionA, const AZ::Vector2& directionB) const
{
const float dotProduct = directionA.GetNormalized().Dot(directionB.GetNormalized());
const float normalizedDirectionDifference = (2.0f - (1.0f + dotProduct)) * 0.5f;
return AZ::GetAbs(normalizedDirectionDifference);
}
float Feature::GetNormalizedDirectionDifference(const AZ::Vector3& directionA, const AZ::Vector3& directionB) const
{
const float dotProduct = directionA.GetNormalized().Dot(directionB.GetNormalized());
const float normalizedDirectionDifference = (2.0f - (1.0f + dotProduct)) * 0.5f;
return AZ::GetAbs(normalizedDirectionDifference);
}
float Feature::CalcResidual(float value) const
{
if (m_residualType == ResidualType::Squared)
{
return value * value;
}
return AZ::Abs(value);
}
float Feature::CalcResidual(const AZ::Vector3& a, const AZ::Vector3& b) const
{
const float euclideanDistance = (b - a).GetLength();
return CalcResidual(euclideanDistance);
}
AZ::Crc32 Feature::GetCostFactorVisibility() const
{
return AZ::Edit::PropertyVisibility::Show;
}
void Feature::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<Feature>()
->Version(2)
->Field("id", &Feature::m_id)
->Field("name", &Feature::m_name)
->Field("jointName", &Feature::m_jointName)
->Field("relativeToJointName", &Feature::m_relativeToJointName)
->Field("debugDraw", &Feature::m_debugDrawEnabled)
->Field("debugColor", &Feature::m_debugColor)
->Field("costFactor", &Feature::m_costFactor)
->Field("residualType", &Feature::m_residualType)
;
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<Feature>("Feature", "Base class for a feature")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AutoExpand, "")
->DataElement(AZ::Edit::UIHandlers::Default, &Feature::m_name, "Name", "Custom name of the feature used for identification and debug visualizations.")
->DataElement(AZ_CRC_CE("ActorNode"), &Feature::m_jointName, "Joint", "The joint to extract the data from.")
->DataElement(AZ_CRC_CE("ActorNode"), &Feature::m_relativeToJointName, "Relative To Joint", "When extracting feature data, convert it into the space relative to the given joint.")
->DataElement(AZ::Edit::UIHandlers::Default, &Feature::m_debugDrawEnabled, "Debug Draw", "Are debug visualizations enabled for this feature?")
->DataElement(AZ::Edit::UIHandlers::Default, &Feature::m_debugColor, "Debug Draw Color", "Color used for debug visualizations to identify the feature.")
->DataElement(AZ::Edit::UIHandlers::SpinBox, &Feature::m_costFactor, "Cost Factor", "The cost factor is multiplied with the feature cost and can be used to change the feature's influence in the motion matching search.")
->Attribute(AZ::Edit::Attributes::Min, 0.0f)
->Attribute(AZ::Edit::Attributes::Max, 100.0f)
->Attribute(AZ::Edit::Attributes::Step, 0.1f)
->Attribute(AZ::Edit::Attributes::Visibility, &Feature::GetCostFactorVisibility)
->DataElement(AZ::Edit::UIHandlers::ComboBox, &Feature::m_residualType, "Residual", "Use 'Squared' to de-emphasize small differences and overweight larger ones. Use 'Absolute' when differences should contribute linearly, without that effect.")
->EnumAttribute(ResidualType::Absolute, "Absolute")
->EnumAttribute(ResidualType::Squared, "Squared")
;
}
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,174 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Color.h>
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <EMotionFX/Source/Node.h>
#include <EMotionFX/Source/Skeleton.h>
#include <AzFramework/Entity/EntityDebugDisplayBus.h>
#include <FeatureMatrix.h>
namespace EMotionFX
{
class ActorInstance;
class MotionInstance;
class Pose;
class Motion;
} // namespace EMotionFX
namespace EMotionFX::MotionMatching
{
class Frame;
class FrameDatabase;
class MotionMatchingInstance;
class TrajectoryQuery;
class EMFX_API Feature
{
public:
AZ_RTTI(Feature, "{DE9CBC48-9176-4DF1-8306-4B1E621F0E76}")
AZ_CLASS_ALLOCATOR_DECL
Feature() = default;
virtual ~Feature() = default;
////////////////////////////////////////////////////////////////////////
// Initialization
struct EMFX_API InitSettings
{
ActorInstance* m_actorInstance = nullptr;
FeatureMatrix::Index m_featureColumnStartOffset = 0;
};
virtual bool Init(const InitSettings& settings);
////////////////////////////////////////////////////////////////////////
// Feature extraction
struct EMFX_API ExtractFeatureContext
{
ExtractFeatureContext(FeatureMatrix& featureMatrix)
: m_featureMatrix(featureMatrix)
{
}
FrameDatabase* m_frameDatabase = nullptr;
FeatureMatrix& m_featureMatrix;
size_t m_frameIndex = InvalidIndex;
const Pose* m_framePose = nullptr; //! Pre-sampled pose for the given frame.
ActorInstance* m_actorInstance = nullptr;
};
virtual void ExtractFeatureValues(const ExtractFeatureContext& context) = 0;
////////////////////////////////////////////////////////////////////////
// Feature cost
struct EMFX_API FrameCostContext
{
FrameCostContext(const FeatureMatrix& featureMatrix, const Pose& currentPose)
: m_featureMatrix(featureMatrix)
, m_currentPose(currentPose)
{
}
const FeatureMatrix& m_featureMatrix;
const ActorInstance* m_actorInstance = nullptr;
const Pose& m_currentPose; //! Current actor instance pose.
const TrajectoryQuery* m_trajectoryQuery = nullptr;
};
virtual float CalculateFrameCost(size_t frameIndex, const FrameCostContext& context) const;
//! Specifies how the feature value differences (residuals) between the input query values
//! and the frames in the motion database are calculated before they are summed up into the feature cost.
enum ResidualType
{
Absolute,
Squared
};
void SetCostFactor(float costFactor) { m_costFactor = costFactor; }
float GetCostFactor() const { return m_costFactor; }
virtual void FillQueryFeatureValues([[maybe_unused]] size_t startIndex,
[[maybe_unused]] AZStd::vector<float>& queryFeatureValues,
[[maybe_unused]] const FrameCostContext& context) {}
virtual void DebugDraw([[maybe_unused]] AzFramework::DebugDisplayRequests& debugDisplay,
[[maybe_unused]] MotionMatchingInstance* instance,
[[maybe_unused]] size_t frameIndex) {}
void SetDebugDrawColor(const AZ::Color& color);
const AZ::Color& GetDebugDrawColor() const;
void SetDebugDrawEnabled(bool enabled);
bool GetDebugDrawEnabled() const;
void SetJointName(const AZStd::string& jointName) { m_jointName = jointName; }
const AZStd::string& GetJointName() const { return m_jointName; }
void SetRelativeToJointName(const AZStd::string& jointName) { m_relativeToJointName = jointName; }
const AZStd::string& GetRelativeToJointName() const { return m_relativeToJointName; }
void SetName(const AZStd::string& name) { m_name = name; }
const AZStd::string& GetName() const { return m_name; }
virtual size_t GetNumDimensions() const = 0;
virtual AZStd::string GetDimensionName([[maybe_unused]] size_t index) const { return "Unknown"; }
// Column offset of the first value of this feature inside the feature matrix.
FeatureMatrix::Index GetColumnOffset() const { return m_featureColumnOffset; }
void SetColumnOffset(FeatureMatrix::Index offset) { m_featureColumnOffset = offset; }
const AZ::TypeId& GetId() const { return m_id; }
size_t GetRelativeToNodeIndex() const { return m_relativeToNodeIndex; }
void SetRelativeToNodeIndex(size_t nodeIndex);
static void Reflect(AZ::ReflectContext* context);
static void CalculateVelocity(size_t jointIndex, size_t relativeToJointIndex, MotionInstance* motionInstance, AZ::Vector3& outVelocity);
static void CalculateVelocity(const ActorInstance* actorInstance, size_t jointIndex, size_t relativeToJointIndex, const Frame& frame, AZ::Vector3& outVelocity);
protected:
/**
 * Calculate a normalized direction difference between the two given vectors.
 * The dot product of the two normalized vectors, which lies in range [-1, 1], is remapped to [0, 1].
 * @result Normalized, absolute difference between the vectors.
 *         Angle difference   Dot result   Cost
 *           0.0 degrees         1.0        0.0
 *          90.0 degrees         0.0        0.5
 *         180.0 degrees        -1.0        1.0
 *         270.0 degrees         0.0        0.5
 */
float GetNormalizedDirectionDifference(const AZ::Vector2& directionA, const AZ::Vector2& directionB) const;
float GetNormalizedDirectionDifference(const AZ::Vector3& directionA, const AZ::Vector3& directionB) const;
float CalcResidual(float value) const;
float CalcResidual(const AZ::Vector3& a, const AZ::Vector3& b) const;
virtual AZ::Crc32 GetCostFactorVisibility() const;
// Shared and reflected data.
AZ::TypeId m_id = AZ::TypeId::CreateRandom(); //< The feature identification number. Use this instead of the RTTI class ID so that we can have multiple of the same type.
AZStd::string m_name; //< Display name used for feature identification and debug visualizations.
AZStd::string m_jointName; //< Joint name to extract the data from.
AZStd::string m_relativeToJointName; //< When extracting feature data, convert it into the space relative to the given joint.
AZ::Color m_debugColor = AZ::Colors::Green; //< Color used for debug visualizations to identify the feature.
bool m_debugDrawEnabled = false; //< Are debug visualizations enabled for this feature?
float m_costFactor = 1.0f; //< The cost factor is multiplied with the feature cost and can be used to change the feature's influence in the motion matching search.
ResidualType m_residualType = ResidualType::Squared; //< How the differences (residuals) between the input query values and the frames in the motion database are calculated before they are summed up into the feature cost.
// Instance data (depends on the feature schema or actor instance).
FeatureMatrix::Index m_featureColumnOffset = 0; //< Starting column inside the feature matrix where the feature values are placed.
size_t m_relativeToNodeIndex = InvalidIndex;
size_t m_jointIndex = InvalidIndex;
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,102 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <iostream>
#include <fstream>
#include <Allocators.h>
#include <FeatureMatrix.h>
#include <FeatureSchema.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(FeatureMatrix, MotionMatchAllocator, 0)
void FeatureMatrix::Clear()
{
resize(0, 0);
}
void FeatureMatrix::SaveAsCsv(const AZStd::string& filename, const AZStd::vector<AZStd::string>& columnNames)
{
std::ofstream file(filename.c_str());
// Save column names in the first row
if (!columnNames.empty())
{
for (size_t i = 0; i < columnNames.size(); ++i)
{
if (i != 0)
{
file << ",";
}
file << columnNames[i].c_str();
}
file << "\n";
}
// Save coefficients
#ifdef O3DE_USE_EIGEN
// Explicitly specify the precision; otherwise values close to 0.0 get rounded to 0.0.
static const Eigen::IOFormat csvFormat(/*Eigen::StreamPrecision|FullPrecision*/8, Eigen::DontAlignCols, ", ", "\n");
file << format(csvFormat);
#endif
}
void FeatureMatrix::SaveAsCsv(const AZStd::string& filename, const FeatureSchema* featureSchema)
{
AZStd::vector<AZStd::string> columnNames;
for (Feature* feature: featureSchema->GetFeatures())
{
const size_t numDimensions = feature->GetNumDimensions();
for (size_t dimension = 0; dimension < numDimensions; ++dimension)
{
columnNames.push_back(feature->GetDimensionName(dimension));
}
}
SaveAsCsv(filename, columnNames);
}
AZ::Vector2 FeatureMatrix::GetVector2(Index row, Index startColumn) const
{
return AZ::Vector2(
coeff(row, startColumn + 0),
coeff(row, startColumn + 1));
}
void FeatureMatrix::SetVector2(Index row, Index startColumn, const AZ::Vector2& value)
{
operator()(row, startColumn + 0) = value.GetX();
operator()(row, startColumn + 1) = value.GetY();
}
AZ::Vector3 FeatureMatrix::GetVector3(Index row, Index startColumn) const
{
return AZ::Vector3(
coeff(row, startColumn + 0),
coeff(row, startColumn + 1),
coeff(row, startColumn + 2));
}
void FeatureMatrix::SetVector3(Index row, Index startColumn, const AZ::Vector3& value)
{
operator()(row, startColumn + 0) = value.GetX();
operator()(row, startColumn + 1) = value.GetY();
operator()(row, startColumn + 2) = value.GetZ();
}
size_t FeatureMatrix::CalcMemoryUsageInBytes() const
{
const size_t bytesPerValue = sizeof(O3DE_MM_FLOATTYPE);
const size_t numValues = size();
return numValues * bytesPerValue;
}
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,118 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/Math/Vector2.h>
#include <AzCore/Math/Vector3.h>
#include <AzCore/std/containers/vector.h>
#include <AzCore/std/string/string.h>
//#define O3DE_USE_EIGEN
#define O3DE_MM_FLOATTYPE float
#ifdef O3DE_USE_EIGEN
#pragma warning (push, 1)
#pragma warning (disable:4834) // C4834: discarding return value of function with 'nodiscard' attribute
#pragma warning (disable:5031) // #pragma warning(pop): likely mismatch, popping warning state pushed in different file
#pragma warning (disable:4702) // warning C4702: unreachable code
#pragma warning (disable:4723) // warning C4723: potential divide by 0
#include "../../3rdParty/eigen-3.3.9/Eigen/Dense"
#pragma warning (pop)
#endif
namespace EMotionFX::MotionMatching
{
class FeatureSchema;
#ifdef O3DE_USE_EIGEN
// Features are stored in columns, each row represents a frame
// RowMajor: Store row components next to each other in memory for cache-optimized feature access for a given frame.
using FeatureMatrixType = Eigen::Matrix<O3DE_MM_FLOATTYPE, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>;
#else
/**
* Small wrapper for a 2D matrix similar to the Eigen::Matrix.
*/
class FeatureMatrixType
{
public:
size_t size() const
{
return m_data.size();
}
size_t rows() const
{
return m_rowCount;
}
size_t cols() const
{
return m_columnCount;
}
void resize(size_t rowCount, size_t columnCount)
{
m_rowCount = rowCount;
m_columnCount = columnCount;
m_data.resize(m_rowCount * m_columnCount);
}
float& operator()(size_t row, size_t column)
{
return m_data[row * m_columnCount + column];
}
const float& operator()(size_t row, size_t column) const
{
return m_data[row * m_columnCount + column];
}
float coeff(size_t row, size_t column) const
{
return m_data[row * m_columnCount + column];
}
private:
AZStd::vector<float> m_data;
size_t m_rowCount = 0;
size_t m_columnCount = 0;
};
#endif
class FeatureMatrix
: public FeatureMatrixType
{
public:
AZ_RTTI(FeatureMatrix, "{E063C9CB-7147-4776-A6E0-98584DD93FEF}");
AZ_CLASS_ALLOCATOR_DECL
#ifdef O3DE_USE_EIGEN
using Index = Eigen::Index;
#else
using Index = size_t;
#endif
virtual ~FeatureMatrix() = default;
void Clear();
void SaveAsCsv(const AZStd::string& filename, const AZStd::vector<AZStd::string>& columnNames = {});
void SaveAsCsv(const AZStd::string& filename, const FeatureSchema* featureSchema);
size_t CalcMemoryUsageInBytes() const;
AZ::Vector2 GetVector2(Index row, Index startColumn) const;
void SetVector2(Index row, Index startColumn, const AZ::Vector2& value);
AZ::Vector3 GetVector3(Index row, Index startColumn) const;
void SetVector3(Index row, Index startColumn, const AZ::Vector3& value);
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,127 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <Allocators.h>
#include <EMotionFX/Source/ActorInstance.h>
#include <EMotionFX/Source/EMotionFXManager.h>
#include <EMotionFX/Source/EventManager.h>
#include <EMotionFX/Source/TransformData.h>
#include <MotionMatchingData.h>
#include <MotionMatchingInstance.h>
#include <FrameDatabase.h>
#include <FeaturePosition.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
#include <MCore/Source/AzCoreConversions.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(FeaturePosition, MotionMatchAllocator, 0)
void FeaturePosition::FillQueryFeatureValues(size_t startIndex, AZStd::vector<float>& queryFeatureValues, const FrameCostContext& context)
{
const Transform invRootTransform = context.m_currentPose.GetWorldSpaceTransform(m_relativeToNodeIndex).Inversed();
const AZ::Vector3 worldInputPosition = context.m_currentPose.GetWorldSpaceTransform(m_jointIndex).m_position;
const AZ::Vector3 relativeInputPosition = invRootTransform.TransformPoint(worldInputPosition);
queryFeatureValues[startIndex + 0] = relativeInputPosition.GetX();
queryFeatureValues[startIndex + 1] = relativeInputPosition.GetY();
queryFeatureValues[startIndex + 2] = relativeInputPosition.GetZ();
}
void FeaturePosition::ExtractFeatureValues(const ExtractFeatureContext& context)
{
const Transform invRootTransform = context.m_framePose->GetWorldSpaceTransform(m_relativeToNodeIndex).Inversed();
const AZ::Vector3 nodeWorldPosition = context.m_framePose->GetWorldSpaceTransform(m_jointIndex).m_position;
const AZ::Vector3 position = invRootTransform.TransformPoint(nodeWorldPosition);
SetFeatureData(context.m_featureMatrix, context.m_frameIndex, position);
}
void FeaturePosition::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex)
{
const MotionMatchingData* data = instance->GetData();
const ActorInstance* actorInstance = instance->GetActorInstance();
const Pose* pose = actorInstance->GetTransformData()->GetCurrentPose();
const Transform jointModelTM = pose->GetModelSpaceTransform(m_jointIndex);
const Transform relativeToWorldTM = pose->GetWorldSpaceTransform(m_relativeToNodeIndex);
const AZ::Vector3 position = GetFeatureData(data->GetFeatureMatrix(), frameIndex);
const AZ::Vector3 transformedPos = relativeToWorldTM.TransformPoint(position);
constexpr float markerSize = 0.03f;
debugDisplay.DepthTestOff();
debugDisplay.SetColor(m_debugColor);
debugDisplay.DrawBall(transformedPos, markerSize, /*drawShaded=*/false);
}
float FeaturePosition::CalculateFrameCost(size_t frameIndex, const FrameCostContext& context) const
{
const Transform invRootTransform = context.m_currentPose.GetWorldSpaceTransform(m_relativeToNodeIndex).Inversed();
const AZ::Vector3 worldInputPosition = context.m_currentPose.GetWorldSpaceTransform(m_jointIndex).m_position;
const AZ::Vector3 relativeInputPosition = invRootTransform.TransformPoint(worldInputPosition);
const AZ::Vector3 framePosition = GetFeatureData(context.m_featureMatrix, frameIndex); // This is already relative to the root node
return CalcResidual(relativeInputPosition, framePosition);
}
void FeaturePosition::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<FeaturePosition, Feature>()
->Version(1);
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<FeaturePosition>("FeaturePosition", "Matches joint positions.")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AutoExpand, "")
;
}
size_t FeaturePosition::GetNumDimensions() const
{
return 3;
}
AZStd::string FeaturePosition::GetDimensionName(size_t index) const
{
AZStd::string result = m_jointName;
result += '.';
switch (index)
{
case 0: { result += "PosX"; break; }
case 1: { result += "PosY"; break; }
case 2: { result += "PosZ"; break; }
default: { result += Feature::GetDimensionName(index); }
}
return result;
}
AZ::Vector3 FeaturePosition::GetFeatureData(const FeatureMatrix& featureMatrix, size_t frameIndex) const
{
return featureMatrix.GetVector3(frameIndex, m_featureColumnOffset);
}
void FeaturePosition::SetFeatureData(FeatureMatrix& featureMatrix, size_t frameIndex, const AZ::Vector3& position)
{
featureMatrix.SetVector3(frameIndex, m_featureColumnOffset, position);
}
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,55 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Vector3.h>
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/RTTI/TypeInfo.h>
#include <AzCore/std/containers/vector.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <Feature.h>
namespace AZ
{
class ReflectContext;
}
namespace EMotionFX::MotionMatching
{
class FrameDatabase;
class EMFX_API FeaturePosition
: public Feature
{
public:
AZ_RTTI(FeaturePosition, "{3EAA6459-DB59-4EA1-B8B3-C933A83AA77D}", Feature)
AZ_CLASS_ALLOCATOR_DECL
FeaturePosition() = default;
~FeaturePosition() override = default;
void ExtractFeatureValues(const ExtractFeatureContext& context) override;
void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex) override;
float CalculateFrameCost(size_t frameIndex, const FrameCostContext& context) const override;
void FillQueryFeatureValues(size_t startIndex, AZStd::vector<float>& queryFeatureValues, const FrameCostContext& context) override;
static void Reflect(AZ::ReflectContext* context);
size_t GetNumDimensions() const override;
AZStd::string GetDimensionName(size_t index) const override;
AZ::Vector3 GetFeatureData(const FeatureMatrix& featureMatrix, size_t frameIndex) const;
void SetFeatureData(FeatureMatrix& featureMatrix, size_t frameIndex, const AZ::Vector3& position);
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,123 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
#include <AzCore/Component/ComponentApplicationBus.h>
#include <Allocators.h>
#include <FeatureSchema.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(FeatureSchema, MotionMatchAllocator, 0)
FeatureSchema::~FeatureSchema()
{
Clear();
}
Feature* FeatureSchema::GetFeature(size_t index) const
{
return m_features[index];
}
const AZStd::vector<Feature*>& FeatureSchema::GetFeatures() const
{
return m_features;
}
void FeatureSchema::AddFeature(Feature* feature)
{
// Try to see if there is a feature with the same id already.
auto iterator = AZStd::find_if(m_featuresById.begin(), m_featuresById.end(), [&feature](const auto& curEntry) -> bool {
return (feature->GetId() == curEntry.second->GetId());
});
if (iterator != m_featuresById.end())
{
AZ_Assert(false, "Cannot add feature. Feature with id '%s' has already been registered.", feature->GetId().ToString<AZStd::string>().c_str());
return;
}
m_featuresById.emplace(feature->GetId(), feature);
m_features.emplace_back(feature);
}
void FeatureSchema::Clear()
{
for (Feature* feature : m_features)
{
delete feature;
}
m_featuresById.clear();
m_features.clear();
}
size_t FeatureSchema::GetNumFeatures() const
{
return m_features.size();
}
Feature* FeatureSchema::FindFeatureById(const AZ::TypeId& featureId) const
{
const auto result = m_featuresById.find(featureId);
if (result == m_featuresById.end())
{
return nullptr;
}
return result->second;
}
Feature* FeatureSchema::CreateFeatureByType(const AZ::TypeId& typeId)
{
AZ::SerializeContext* context = nullptr;
AZ::ComponentApplicationBus::BroadcastResult(context, &AZ::ComponentApplicationBus::Events::GetSerializeContext);
if (!context)
{
AZ_Error("Motion Matching", false, "Can't get serialize context from component application.");
return nullptr;
}
const AZ::SerializeContext::ClassData* classData = context->FindClassData(typeId);
if (!classData)
{
AZ_Warning("Motion Matching", false, "Can't find class data for this type.");
return nullptr;
}
Feature* featureObject = reinterpret_cast<Feature*>(classData->m_factory->Create(classData->m_name));
return featureObject;
}
void FeatureSchema::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<FeatureSchema>()
->Version(1)
->Field("features", &FeatureSchema::m_features);
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<FeatureSchema>("FeatureSchema", "")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->DataElement(AZ::Edit::UIHandlers::Default, &FeatureSchema::m_features, "Features", "")
->Attribute(AZ::Edit::Attributes::AutoExpand, "")
;
}
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,47 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/std/containers/unordered_map.h>
#include <AzCore/std/containers/vector.h>
#include <Feature.h>
namespace EMotionFX::MotionMatching
{
//! The set of features involved in the motion matching search.
//! The schema represents the order of the features as well as their settings while the feature matrix stores the actual feature data.
class EMFX_API FeatureSchema
{
public:
AZ_RTTI(FeatureSchema, "{E34F6BFE-73DB-4DED-AAB9-09FBC5113236}")
AZ_CLASS_ALLOCATOR_DECL
virtual ~FeatureSchema();
void AddFeature(Feature* feature);
void Clear();
size_t GetNumFeatures() const;
Feature* GetFeature(size_t index) const;
const AZStd::vector<Feature*>& GetFeatures() const;
Feature* FindFeatureById(const AZ::TypeId& featureId) const;
static void Reflect(AZ::ReflectContext* context);
protected:
static Feature* CreateFeatureByType(const AZ::TypeId& typeId);
AZStd::vector<Feature*> m_features; //!< Ordered set of features (owns the feature objects).
AZStd::unordered_map<AZ::TypeId, Feature*> m_featuresById; //!< Hash map for fast feature lookup by ID (non-owning).
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,82 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <FeaturePosition.h>
#include <FeatureTrajectory.h>
#include <FeatureVelocity.h>
#include <FeatureSchemaDefault.h>
namespace EMotionFX::MotionMatching
{
void DefaultFeatureSchema(FeatureSchema& featureSchema, DefaultFeatureSchemaInitSettings settings)
{
featureSchema.Clear();
const AZStd::string& rootJointName = settings.m_rootJointName;
//----------------------------------------------------------------------------------------------------------
// Past and future root trajectory
FeatureTrajectory* rootTrajectory = aznew FeatureTrajectory();
rootTrajectory->SetJointName(rootJointName);
rootTrajectory->SetRelativeToJointName(rootJointName);
rootTrajectory->SetDebugDrawColor(AZ::Color::CreateFromRgba(157,78,221,255));
rootTrajectory->SetDebugDrawEnabled(true);
featureSchema.AddFeature(rootTrajectory);
//----------------------------------------------------------------------------------------------------------
// Left foot position
FeaturePosition* leftFootPosition = aznew FeaturePosition();
leftFootPosition->SetName("Left Foot Position");
leftFootPosition->SetJointName(settings.m_leftFootJointName);
leftFootPosition->SetRelativeToJointName(rootJointName);
leftFootPosition->SetDebugDrawColor(AZ::Color::CreateFromRgba(255,173,173,255));
leftFootPosition->SetDebugDrawEnabled(true);
featureSchema.AddFeature(leftFootPosition);
//----------------------------------------------------------------------------------------------------------
// Right foot position
FeaturePosition* rightFootPosition = aznew FeaturePosition();
rightFootPosition->SetName("Right Foot Position");
rightFootPosition->SetJointName(settings.m_rightFootJointName);
rightFootPosition->SetRelativeToJointName(rootJointName);
rightFootPosition->SetDebugDrawColor(AZ::Color::CreateFromRgba(253,255,182,255));
rightFootPosition->SetDebugDrawEnabled(true);
featureSchema.AddFeature(rightFootPosition);
//----------------------------------------------------------------------------------------------------------
// Left foot velocity
FeatureVelocity* leftFootVelocity = aznew FeatureVelocity();
leftFootVelocity->SetName("Left Foot Velocity");
leftFootVelocity->SetJointName(settings.m_leftFootJointName);
leftFootVelocity->SetRelativeToJointName(rootJointName);
leftFootVelocity->SetDebugDrawColor(AZ::Color::CreateFromRgba(155,246,255,255));
leftFootVelocity->SetDebugDrawEnabled(true);
leftFootVelocity->SetCostFactor(0.75f);
featureSchema.AddFeature(leftFootVelocity);
//----------------------------------------------------------------------------------------------------------
// Right foot velocity
FeatureVelocity* rightFootVelocity = aznew FeatureVelocity();
rightFootVelocity->SetName("Right Foot Velocity");
rightFootVelocity->SetJointName(settings.m_rightFootJointName);
rightFootVelocity->SetRelativeToJointName(rootJointName);
rightFootVelocity->SetDebugDrawColor(AZ::Color::CreateFromRgba(189,178,255,255));
rightFootVelocity->SetDebugDrawEnabled(true);
rightFootVelocity->SetCostFactor(0.75f);
featureSchema.AddFeature(rightFootVelocity);
//----------------------------------------------------------------------------------------------------------
// Pelvis velocity
FeatureVelocity* pelvisVelocity = aznew FeatureVelocity();
pelvisVelocity->SetName("Pelvis Velocity");
pelvisVelocity->SetJointName(settings.m_pelvisJointName);
pelvisVelocity->SetRelativeToJointName(rootJointName);
pelvisVelocity->SetDebugDrawColor(AZ::Color::CreateFromRgba(185,255,175,255));
pelvisVelocity->SetDebugDrawEnabled(true);
featureSchema.AddFeature(pelvisVelocity);
}
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,23 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <FeatureSchema.h>
namespace EMotionFX::MotionMatching
{
struct DefaultFeatureSchemaInitSettings
{
AZStd::string m_rootJointName;
AZStd::string m_leftFootJointName;
AZStd::string m_rightFootJointName;
AZStd::string m_pelvisJointName;
};
void DefaultFeatureSchema(FeatureSchema& featureSchema, DefaultFeatureSchemaInitSettings settings);
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,450 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <EMotionFX/Source/ActorInstance.h>
#include <Allocators.h>
#include <EMotionFX/Source/AnimGraphPose.h>
#include <EMotionFX/Source/AnimGraphPosePool.h>
#include <EMotionFX/Source/EventManager.h>
#include <MotionMatchingData.h>
#include <MotionMatchingInstance.h>
#include <FrameDatabase.h>
#include <FeatureTrajectory.h>
#include <EMotionFX/Source/Pose.h>
#include <EMotionFX/Source/Transform.h>
#include <EMotionFX/Source/TransformData.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(FeatureTrajectory, MotionMatchAllocator, 0)
bool FeatureTrajectory::Init(const InitSettings& settings)
{
const bool result = Feature::Init(settings);
UpdateFacingAxis();
return result;
}
size_t FeatureTrajectory::CalcNumSamplesPerFrame() const
{
return m_numPastSamples + 1 + m_numFutureSamples;
}
void FeatureTrajectory::SetFacingAxis(const Axis axis)
{
m_facingAxis = axis;
UpdateFacingAxis();
}
void FeatureTrajectory::UpdateFacingAxis()
{
switch (m_facingAxis)
{
case Axis::X:
{
m_facingAxisDir = AZ::Vector3::CreateAxisX();
break;
}
case Axis::Y:
{
m_facingAxisDir = AZ::Vector3::CreateAxisY();
break;
}
case Axis::X_NEGATIVE:
{
m_facingAxisDir = -AZ::Vector3::CreateAxisX();
break;
}
case Axis::Y_NEGATIVE:
{
m_facingAxisDir = -AZ::Vector3::CreateAxisY();
break;
}
default:
{
AZ_Assert(false, "Facing direction axis unknown.");
}
}
}
AZ::Vector2 FeatureTrajectory::CalculateFacingDirection(const Pose& pose, const Transform& invRootTransform) const
{
// Get the facing direction of the given joint for the given pose in animation world space.
// The given pose is either sampled into the relative past or future based on the frame we want to extract the feature for.
const AZ::Vector3 facingDirAnimationWorldSpace = pose.GetWorldSpaceTransform(m_jointIndex).TransformVector(m_facingAxisDir);
// The invRootTransform is the inverse of the world space transform for the given joint at the frame we want to extract the feature for.
// The result after this will be the facing direction relative to the frame we want to extract the feature for.
const AZ::Vector3 facingDirection = invRootTransform.TransformVector(facingDirAnimationWorldSpace);
// Project to the ground plane and make sure the direction is normalized.
return AZ::Vector2(facingDirection).GetNormalizedSafe();
}
FeatureTrajectory::Sample FeatureTrajectory::GetSampleFromPose(const Pose& pose, const Transform& invRootTransform) const
{
// Position of the root joint in the model space relative to frame to extract.
const AZ::Vector2 position = AZ::Vector2(invRootTransform.TransformPoint(pose.GetWorldSpaceTransform(m_jointIndex).m_position));
// Calculate the facing direction.
const AZ::Vector2 facingDirection = CalculateFacingDirection(pose, invRootTransform);
return { position, facingDirection };
}
void FeatureTrajectory::ExtractFeatureValues(const ExtractFeatureContext& context)
{
const ActorInstance* actorInstance = context.m_actorInstance;
AnimGraphPosePool& posePool = GetEMotionFX().GetThreadData(actorInstance->GetThreadIndex())->GetPosePool();
AnimGraphPose* samplePose = posePool.RequestPose(actorInstance);
AnimGraphPose* nextSamplePose = posePool.RequestPose(actorInstance);
const size_t frameIndex = context.m_frameIndex;
const Frame& currentFrame = context.m_frameDatabase->GetFrame(context.m_frameIndex);
// Inverse of the root transform for the frame that we want to extract data from.
const Transform invRootTransform = context.m_framePose->GetWorldSpaceTransform(m_relativeToNodeIndex).Inversed();
const size_t midSampleIndex = CalcMidFrameIndex();
const Sample midSample = GetSampleFromPose(*context.m_framePose, invRootTransform);
SetFeatureData(context.m_featureMatrix, frameIndex, midSampleIndex, midSample);
// Sample the past.
const float pastFrameTimeDelta = m_pastTimeRange / static_cast<float>(m_numPastSamples - 1);
currentFrame.SamplePose(&samplePose->GetPose());
for (size_t i = 0; i < m_numPastSamples; ++i)
{
// Offset the sample index by one, as the zeroth past/future sample already lies one time delta away from the current frame.
const float sampleTimeOffset = (i+1) * pastFrameTimeDelta * (-1.0f);
currentFrame.SamplePose(&nextSamplePose->GetPose(), sampleTimeOffset);
const Sample sample = GetSampleFromPose(samplePose->GetPose(), invRootTransform);
const size_t sampleIndex = CalcPastFrameIndex(i);
SetFeatureData(context.m_featureMatrix, frameIndex, sampleIndex, sample);
*samplePose = *nextSamplePose;
}
// Sample into the future.
const float futureFrameTimeDelta = m_futureTimeRange / static_cast<float>(m_numFutureSamples - 1);
currentFrame.SamplePose(&samplePose->GetPose());
for (size_t i = 0; i < m_numFutureSamples; ++i)
{
// Sample the value at the future sample point.
const float sampleTimeOffset = (i+1) * futureFrameTimeDelta;
currentFrame.SamplePose(&nextSamplePose->GetPose(), sampleTimeOffset);
const Sample sample = GetSampleFromPose(samplePose->GetPose(), invRootTransform);
const size_t sampleIndex = CalcFutureFrameIndex(i);
SetFeatureData(context.m_featureMatrix, frameIndex, sampleIndex, sample);
*samplePose = *nextSamplePose;
}
posePool.FreePose(samplePose);
posePool.FreePose(nextSamplePose);
}
void FeatureTrajectory::SetPastTimeRange(float timeInSeconds)
{
m_pastTimeRange = timeInSeconds;
}
void FeatureTrajectory::SetFutureTimeRange(float timeInSeconds)
{
m_futureTimeRange = timeInSeconds;
}
void FeatureTrajectory::SetNumPastSamplesPerFrame(size_t numHistorySamples)
{
m_numPastSamples = numHistorySamples;
}
void FeatureTrajectory::SetNumFutureSamplesPerFrame(size_t numFutureSamples)
{
m_numFutureSamples = numFutureSamples;
}
void FeatureTrajectory::DebugDrawFacingDirection(AzFramework::DebugDisplayRequests& debugDisplay,
const AZ::Vector3& positionWorldSpace,
const AZ::Vector3& facingDirectionWorldSpace)
{
const float length = 0.2f;
const float radius = 0.01f;
const AZ::Vector3 facingDirectionTarget = positionWorldSpace + facingDirectionWorldSpace * length;
debugDisplay.DrawSolidCylinder(/*center=*/(facingDirectionTarget + positionWorldSpace) * 0.5f,
/*direction=*/facingDirectionWorldSpace,
radius,
/*height=*/length,
/*drawShaded=*/false);
}
void FeatureTrajectory::DebugDrawFacingDirection(AzFramework::DebugDisplayRequests& debugDisplay,
const Transform& worldSpaceTransform,
const Sample& sample,
const AZ::Vector3& samplePosWorldSpace) const
{
const AZ::Vector3 facingDirectionWorldSpace = worldSpaceTransform.TransformVector(AZ::Vector3(sample.m_facingDirection)).GetNormalizedSafe();
DebugDrawFacingDirection(debugDisplay, samplePosWorldSpace, facingDirectionWorldSpace);
}
void FeatureTrajectory::DebugDrawTrajectory(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex,
const Transform& worldSpaceTransform,
const AZ::Color& color,
size_t numSamples,
const SplineToFeatureMatrixIndex& splineToFeatureMatrixIndex) const
{
if (frameIndex == InvalidIndex)
{
return;
}
constexpr float markerSize = 0.02f;
const FeatureMatrix& featureMatrix = instance->GetData()->GetFeatureMatrix();
debugDisplay.DepthTestOff();
debugDisplay.SetColor(color);
Sample nextSample;
AZ::Vector3 nextSamplePos;
for (size_t i = 0; i < numSamples - 1; ++i)
{
const Sample currentSample = GetFeatureData(featureMatrix, frameIndex, splineToFeatureMatrixIndex(i));
nextSample = GetFeatureData(featureMatrix, frameIndex, splineToFeatureMatrixIndex(i + 1));
const AZ::Vector3 currentSamplePos = worldSpaceTransform.TransformPoint(AZ::Vector3(currentSample.m_position));
nextSamplePos = worldSpaceTransform.TransformPoint(AZ::Vector3(nextSample.m_position));
// Line between current and next sample.
debugDisplay.DrawSolidCylinder(/*center=*/(nextSamplePos + currentSamplePos) * 0.5f,
/*direction=*/(nextSamplePos - currentSamplePos).GetNormalizedSafe(),
/*radius=*/0.0025f,
/*height=*/(nextSamplePos - currentSamplePos).GetLength(),
/*drawShaded=*/false);
// Sphere at the sample position and a cylinder to indicate the facing direction.
debugDisplay.DrawBall(currentSamplePos, markerSize, /*drawShaded=*/false);
DebugDrawFacingDirection(debugDisplay, worldSpaceTransform, currentSample, currentSamplePos);
}
debugDisplay.DrawBall(nextSamplePos, markerSize, /*drawShaded=*/false);
DebugDrawFacingDirection(debugDisplay, worldSpaceTransform, nextSample, nextSamplePos);
}
void FeatureTrajectory::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex)
{
const ActorInstance* actorInstance = instance->GetActorInstance();
const Transform transform = actorInstance->GetTransformData()->GetCurrentPose()->GetWorldSpaceTransform(m_jointIndex);
DebugDrawTrajectory(debugDisplay, instance, frameIndex, transform,
m_debugColor, m_numPastSamples, AZStd::bind(&FeatureTrajectory::CalcPastFrameIndex, this, AZStd::placeholders::_1));
DebugDrawTrajectory(debugDisplay, instance, frameIndex, transform,
m_debugColor, m_numFutureSamples, AZStd::bind(&FeatureTrajectory::CalcFutureFrameIndex, this, AZStd::placeholders::_1));
}
size_t FeatureTrajectory::CalcMidFrameIndex() const
{
return m_numPastSamples;
}
size_t FeatureTrajectory::CalcPastFrameIndex(size_t historyFrameIndex) const
{
AZ_Assert(historyFrameIndex < m_numPastSamples, "The history frame index is out of range");
return m_numPastSamples - historyFrameIndex - 1;
}
size_t FeatureTrajectory::CalcFutureFrameIndex(size_t futureFrameIndex) const
{
AZ_Assert(futureFrameIndex < m_numFutureSamples, "The future frame index is out of range");
return CalcMidFrameIndex() + 1 + futureFrameIndex;
}
float FeatureTrajectory::CalculateCost(const FeatureMatrix& featureMatrix,
size_t frameIndex,
const Transform& invRootTransform,
const AZStd::vector<TrajectoryQuery::ControlPoint>& controlPoints,
const SplineToFeatureMatrixIndex& splineToFeatureMatrixIndex) const
{
float cost = 0.0f;
AZ::Vector2 lastControlPoint, lastSamplePos;
for (size_t i = 0; i < controlPoints.size(); ++i)
{
const TrajectoryQuery::ControlPoint& controlPoint = controlPoints[i];
const Sample sample = GetFeatureData(featureMatrix, frameIndex, splineToFeatureMatrixIndex(i));
const AZ::Vector2& samplePos = sample.m_position;
const AZ::Vector2 controlPointPos = AZ::Vector2(invRootTransform.TransformPoint(controlPoint.m_position)); // Convert the control point so it is relative to the current character position and facing direction.
if (i != 0)
{
const AZ::Vector2 controlPointDelta = controlPointPos - lastControlPoint;
const AZ::Vector2 sampleDelta = samplePos - lastSamplePos;
const float posDistance = (samplePos - controlPointPos).GetLength();
const float posDeltaDistance = (controlPointDelta - sampleDelta).GetLength();
// The facing direction from the control point (trajectory query) is in world space while the facing direction from the
// sample of this trajectory feature is in relative-to-frame-root-joint space.
const AZ::Vector2 controlPointFacingDirRelativeSpace = AZ::Vector2(invRootTransform.TransformVector(controlPoint.m_facingDirection));
const float facingDirectionCost = GetNormalizedDirectionDifference(sample.m_facingDirection,
controlPointFacingDirRelativeSpace);
// As the position contributes two cost terms, double the facing direction cost to balance its influence.
cost += CalcResidual(posDistance) + CalcResidual(posDeltaDistance) + CalcResidual(facingDirectionCost) * 2.0f;
}
lastControlPoint = controlPointPos;
lastSamplePos = samplePos;
}
return cost;
}
float FeatureTrajectory::CalculateFutureFrameCost(size_t frameIndex, const FrameCostContext& context) const
{
AZ_Assert(context.m_trajectoryQuery->GetFutureControlPoints().size() == m_numFutureSamples, "Number of future control points from the trajectory query does not match the one from the trajectory feature.");
const Transform invRootTransform = context.m_currentPose.GetWorldSpaceTransform(m_relativeToNodeIndex).Inversed();
return CalculateCost(context.m_featureMatrix, frameIndex, invRootTransform, context.m_trajectoryQuery->GetFutureControlPoints(), AZStd::bind(&FeatureTrajectory::CalcFutureFrameIndex, this, AZStd::placeholders::_1));
}
float FeatureTrajectory::CalculatePastFrameCost(size_t frameIndex, const FrameCostContext& context) const
{
AZ_Assert(context.m_trajectoryQuery->GetPastControlPoints().size() == m_numPastSamples, "Number of past control points from the trajectory query does not match the one from the trajectory feature");
const Transform invRootTransform = context.m_currentPose.GetWorldSpaceTransform(m_relativeToNodeIndex).Inversed();
return CalculateCost(context.m_featureMatrix, frameIndex, invRootTransform, context.m_trajectoryQuery->GetPastControlPoints(), AZStd::bind(&FeatureTrajectory::CalcPastFrameIndex, this, AZStd::placeholders::_1));
}
AZ::Crc32 FeatureTrajectory::GetCostFactorVisibility() const
{
return AZ::Edit::PropertyVisibility::Hide;
}
void FeatureTrajectory::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<FeatureTrajectory, Feature>()
->Version(2)
->Field("pastTimeRange", &FeatureTrajectory::m_pastTimeRange)
->Field("numPastSamples", &FeatureTrajectory::m_numPastSamples)
->Field("pastCostFactor", &FeatureTrajectory::m_pastCostFactor)
->Field("futureTimeRange", &FeatureTrajectory::m_futureTimeRange)
->Field("numFutureSamples", &FeatureTrajectory::m_numFutureSamples)
->Field("futureCostFactor", &FeatureTrajectory::m_futureCostFactor)
->Field("facingAxis", &FeatureTrajectory::m_facingAxis)
;
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<FeatureTrajectory>("FeatureTrajectory", "Matches the joint past and future trajectory.")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AutoExpand, true)
->DataElement(AZ::Edit::UIHandlers::Default, &FeatureTrajectory::m_numPastSamples, "Past Samples", "The number of samples stored per frame for the past trajectory. [Default = 4 samples to represent the trajectory history]")
->Attribute(AZ::Edit::Attributes::Min, 1)
->Attribute(AZ::Edit::Attributes::Max, 100)
->Attribute(AZ::Edit::Attributes::Step, 1)
->DataElement(AZ::Edit::UIHandlers::Default, &FeatureTrajectory::m_pastTimeRange, "Past Time Range", "The time window the samples are distributed along for the trajectory history. [Default = 0.7 seconds]")
->Attribute(AZ::Edit::Attributes::Min, 0.01f)
->Attribute(AZ::Edit::Attributes::Max, 10.0f)
->Attribute(AZ::Edit::Attributes::Step, 0.1f)
->DataElement(AZ::Edit::UIHandlers::Default, &FeatureTrajectory::m_pastCostFactor, "Past Cost Factor", "The cost factor is multiplied with the cost from the trajectory history and can be used to change the influence of the trajectory history match in the motion matching search.")
->Attribute(AZ::Edit::Attributes::Min, 0.0f)
->Attribute(AZ::Edit::Attributes::Max, 100.0f)
->Attribute(AZ::Edit::Attributes::Step, 0.1f)
->DataElement(AZ::Edit::UIHandlers::Default, &FeatureTrajectory::m_numFutureSamples, "Future Samples", "The number of samples stored per frame for the future trajectory. [Default = 6 samples to represent the future trajectory]")
->Attribute(AZ::Edit::Attributes::Min, 1)
->Attribute(AZ::Edit::Attributes::Max, 100)
->Attribute(AZ::Edit::Attributes::Step, 1)
->DataElement(AZ::Edit::UIHandlers::Default, &FeatureTrajectory::m_futureTimeRange, "Future Time Range", "The time window the samples are distributed along for the future trajectory. [Default = 1.2 seconds]")
->Attribute(AZ::Edit::Attributes::Min, 0.01f)
->Attribute(AZ::Edit::Attributes::Max, 10.0f)
->Attribute(AZ::Edit::Attributes::Step, 0.1f)
->DataElement(AZ::Edit::UIHandlers::Default, &FeatureTrajectory::m_futureCostFactor, "Future Cost Factor", "The cost factor is multiplied with the cost from the future trajectory and can be used to change the influence of the future trajectory match in the motion matching search.")
->Attribute(AZ::Edit::Attributes::Min, 0.0f)
->Attribute(AZ::Edit::Attributes::Max, 100.0f)
->Attribute(AZ::Edit::Attributes::Step, 0.1f)
->DataElement(AZ::Edit::UIHandlers::ComboBox, &FeatureTrajectory::m_facingAxis, "Facing Axis", "The facing direction of the character. Which axis of the joint transform is facing forward? [Default = Looking into Y-axis direction]")
->Attribute(AZ::Edit::Attributes::ChangeNotify, &FeatureTrajectory::UpdateFacingAxis)
->EnumAttribute(Axis::X, "X")
->EnumAttribute(Axis::X_NEGATIVE, "-X")
->EnumAttribute(Axis::Y, "Y")
->EnumAttribute(Axis::Y_NEGATIVE, "-Y")
;
}
size_t FeatureTrajectory::GetNumDimensions() const
{
return CalcNumSamplesPerFrame() * Sample::s_componentsPerSample;
}
AZStd::string FeatureTrajectory::GetDimensionName(size_t index) const
{
AZStd::string result = "Trajectory";
const int sampleIndex = aznumeric_cast<int>(index) / aznumeric_cast<int>(Sample::s_componentsPerSample);
const int componentIndex = aznumeric_cast<int>(index % Sample::s_componentsPerSample);
const int midSampleIndex = aznumeric_cast<int>(CalcMidFrameIndex());
if (sampleIndex == midSampleIndex)
{
result += ".Current.";
}
else if (sampleIndex < midSampleIndex)
{
result += AZStd::string::format(".Past%i.", sampleIndex - static_cast<int>(m_numPastSamples));
}
else
{
result += AZStd::string::format(".Future%i.", sampleIndex - static_cast<int>(m_numPastSamples));
}
switch (componentIndex)
{
case 0: { result += "PosX"; break; }
case 1: { result += "PosY"; break; }
case 2: { result += "FacingDirX"; break; }
case 3: { result += "FacingDirY"; break; }
default: { result += Feature::GetDimensionName(index); }
}
return result;
}
FeatureTrajectory::Sample FeatureTrajectory::GetFeatureData(const FeatureMatrix& featureMatrix, size_t frameIndex, size_t sampleIndex) const
{
const size_t columnOffset = m_featureColumnOffset + sampleIndex * Sample::s_componentsPerSample;
return {
/*.m_position =*/ featureMatrix.GetVector2(frameIndex, columnOffset + 0),
/*.m_facingDirection =*/ featureMatrix.GetVector2(frameIndex, columnOffset + 2),
};
}
void FeatureTrajectory::SetFeatureData(FeatureMatrix& featureMatrix, size_t frameIndex, size_t sampleIndex, const Sample& sample)
{
const size_t columnOffset = m_featureColumnOffset + sampleIndex * Sample::s_componentsPerSample;
featureMatrix.SetVector2(frameIndex, columnOffset + 0, sample.m_position);
featureMatrix.SetVector2(frameIndex, columnOffset + 2, sample.m_facingDirection);
}
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,148 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Vector3.h>
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/RTTI/TypeInfo.h>
#include <AzCore/std/containers/vector.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <EMotionFX/Source/Transform.h>
#include <MotionMatchingInstance.h>
#include <FeatureTrajectory.h>
#include <Feature.h>
namespace AZ
{
class ReflectContext;
}
namespace EMotionFX::MotionMatching
{
class FrameDatabase;
/**
* Matches the root joint's past and future trajectory.
* For each frame in the motion database, the joint's position and facing direction relative to the current frame are evaluated over a past and a future time window.
* Together, the past and future samples form the trajectory of the current frame within that window: they describe where the character came from to reach the
* current frame and where it will go if the animation keeps playing.
**/
class EMFX_API FeatureTrajectory
: public Feature
{
public:
AZ_RTTI(FeatureTrajectory, "{0451E95B-A452-439A-81ED-3962A06A3992}", Feature)
AZ_CLASS_ALLOCATOR_DECL
enum class Axis
{
X = 0,
Y = 1,
X_NEGATIVE = 2,
Y_NEGATIVE = 3,
};
struct EMFX_API Sample
{
AZ::Vector2 m_position; //!< Position in the space relative to the extracted frame.
AZ::Vector2 m_facingDirection; //!< Facing direction in the space relative to the extracted frame.
static constexpr size_t s_componentsPerSample = 4;
};
FeatureTrajectory() = default;
~FeatureTrajectory() override = default;
bool Init(const InitSettings& settings) override;
void ExtractFeatureValues(const ExtractFeatureContext& context) override;
void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex) override;
float CalculateFutureFrameCost(size_t frameIndex, const FrameCostContext& context) const;
float CalculatePastFrameCost(size_t frameIndex, const FrameCostContext& context) const;
void SetNumPastSamplesPerFrame(size_t numHistorySamples);
void SetNumFutureSamplesPerFrame(size_t numFutureSamples);
void SetPastTimeRange(float timeInSeconds);
void SetFutureTimeRange(float timeInSeconds);
void SetFacingAxis(const Axis axis);
void UpdateFacingAxis();
float GetPastTimeRange() const { return m_pastTimeRange; }
size_t GetNumPastSamples() const { return m_numPastSamples; }
float GetPastCostFactor() const { return m_pastCostFactor; }
float GetFutureTimeRange() const { return m_futureTimeRange; }
size_t GetNumFutureSamples() const { return m_numFutureSamples; }
float GetFutureCostFactor() const { return m_futureCostFactor; }
AZ::Vector2 CalculateFacingDirection(const Pose& pose, const Transform& invRootTransform) const;
AZ::Vector3 GetFacingAxisDir() const { return m_facingAxisDir; }
static void Reflect(AZ::ReflectContext* context);
size_t GetNumDimensions() const override;
AZStd::string GetDimensionName(size_t index) const override;
// Shared helper function to draw a facing direction.
static void DebugDrawFacingDirection(AzFramework::DebugDisplayRequests& debugDisplay,
const AZ::Vector3& positionWorldSpace,
const AZ::Vector3& facingDirectionWorldSpace);
private:
size_t CalcMidFrameIndex() const;
size_t CalcPastFrameIndex(size_t historyFrameIndex) const;
size_t CalcFutureFrameIndex(size_t futureFrameIndex) const;
size_t CalcNumSamplesPerFrame() const;
using SplineToFeatureMatrixIndex = AZStd::function<size_t(size_t)>;
float CalculateCost(const FeatureMatrix& featureMatrix,
size_t frameIndex,
const Transform& invRootTransform,
const AZStd::vector<TrajectoryQuery::ControlPoint>& controlPoints,
const SplineToFeatureMatrixIndex& splineToFeatureMatrixIndex) const;
//! Called for every sample in the past or future range to extract its information.
//! @param[in] pose The sampled pose within the trajectory time range [-m_pastTimeRange, m_futureTimeRange].
//! @param[in] invRootTransform The inverse of the world space transform of the joint at the frame time that the feature is extracted for.
Sample GetSampleFromPose(const Pose& pose, const Transform& invRootTransform) const;
Sample GetFeatureData(const FeatureMatrix& featureMatrix, size_t frameIndex, size_t sampleIndex) const;
void SetFeatureData(FeatureMatrix& featureMatrix, size_t frameIndex, size_t sampleIndex, const Sample& sample);
void DebugDrawTrajectory(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex,
const Transform& transform,
const AZ::Color& color,
size_t numSamples,
const SplineToFeatureMatrixIndex& splineToFeatureMatrixIndex) const;
void DebugDrawFacingDirection(AzFramework::DebugDisplayRequests& debugDisplay,
const Transform& worldSpaceTransform,
const Sample& sample,
const AZ::Vector3& samplePosWorldSpace) const;
AZ::Crc32 GetCostFactorVisibility() const override;
float m_pastTimeRange = 0.7f; //!< The time window the samples are distributed along for the past trajectory.
size_t m_numPastSamples = 4; //!< The number of samples stored per frame for the past (history) trajectory.
float m_pastCostFactor = 0.5f; //!< Normalized value to weight or scale the past trajectory cost.
float m_futureTimeRange = 1.2f; //!< The time window the samples are distributed along for the future trajectory.
size_t m_numFutureSamples = 6; //!< The number of samples stored per frame for the future trajectory.
float m_futureCostFactor = 0.75f; //!< Normalized value to weight or scale the future trajectory cost.
Axis m_facingAxis = Axis::Y; //!< Which axis of the joint transform is facing forward?
AZ::Vector3 m_facingAxisDir = AZ::Vector3::CreateAxisY();
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,152 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <EMotionFX/Source/ActorInstance.h>
#include <Allocators.h>
#include <EMotionFX/Source/EMotionFXManager.h>
#include <EMotionFX/Source/EventManager.h>
#include <EMotionFX/Source/TransformData.h>
#include <MotionMatchingData.h>
#include <MotionMatchingInstance.h>
#include <FrameDatabase.h>
#include <FeatureVelocity.h>
#include <PoseDataJointVelocities.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(FeatureVelocity, MotionMatchAllocator, 0)
void FeatureVelocity::FillQueryFeatureValues(size_t startIndex, AZStd::vector<float>& queryFeatureValues, const FrameCostContext& context)
{
PoseDataJointVelocities* velocityPoseData = static_cast<PoseDataJointVelocities*>(context.m_currentPose.GetPoseDataByType(azrtti_typeid<PoseDataJointVelocities>()));
AZ_Assert(velocityPoseData, "Cannot calculate velocity feature cost without joint velocity pose data.");
const AZ::Vector3 currentVelocity = velocityPoseData->GetVelocity(m_jointIndex);
queryFeatureValues[startIndex + 0] = currentVelocity.GetX();
queryFeatureValues[startIndex + 1] = currentVelocity.GetY();
queryFeatureValues[startIndex + 2] = currentVelocity.GetZ();
}
void FeatureVelocity::ExtractFeatureValues(const ExtractFeatureContext& context)
{
AZ::Vector3 velocity;
CalculateVelocity(context.m_actorInstance, m_jointIndex, m_relativeToNodeIndex, context.m_frameDatabase->GetFrame(context.m_frameIndex), velocity);
SetFeatureData(context.m_featureMatrix, context.m_frameIndex, velocity);
}
void FeatureVelocity::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
const AZ::Vector3& velocity,
size_t jointIndex,
size_t relativeToJointIndex,
const AZ::Color& color)
{
const ActorInstance* actorInstance = instance->GetActorInstance();
const Pose* pose = actorInstance->GetTransformData()->GetCurrentPose();
const Transform jointModelTM = pose->GetModelSpaceTransform(jointIndex);
const Transform relativeToWorldTM = pose->GetWorldSpaceTransform(relativeToJointIndex);
const AZ::Vector3 jointPosition = relativeToWorldTM.TransformPoint(jointModelTM.m_position);
const float scale = 0.15f;
const AZ::Vector3 velocityWorldSpace = relativeToWorldTM.TransformVector(velocity * scale);
DebugDrawVelocity(debugDisplay, jointPosition, velocityWorldSpace, color);
}
void FeatureVelocity::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex)
{
if (m_jointIndex == InvalidIndex)
{
return;
}
const MotionMatchingData* data = instance->GetData();
const AZ::Vector3 velocity = GetFeatureData(data->GetFeatureMatrix(), frameIndex);
DebugDraw(debugDisplay, instance, velocity, m_jointIndex, m_relativeToNodeIndex, m_debugColor);
}
float FeatureVelocity::CalculateFrameCost(size_t frameIndex, const FrameCostContext& context) const
{
PoseDataJointVelocities* velocityPoseData = static_cast<PoseDataJointVelocities*>(context.m_currentPose.GetPoseDataByType(azrtti_typeid<PoseDataJointVelocities>()));
AZ_Assert(velocityPoseData, "Cannot calculate velocity feature cost without joint velocity pose data.");
const AZ::Vector3 currentVelocity = velocityPoseData->GetVelocity(m_jointIndex);
const AZ::Vector3 frameVelocity = GetFeatureData(context.m_featureMatrix, frameIndex);
// Direction difference
const float directionDifferenceCost = GetNormalizedDirectionDifference(frameVelocity.GetNormalized(), currentVelocity.GetNormalized());
// Speed difference
// TODO: This needs to be normalized later on, otherwise the direction difference could be
// weighted too heavily or too lightly compared to the speed values.
const float speedDifferenceCost = frameVelocity.GetLength() - currentVelocity.GetLength();
return CalcResidual(directionDifferenceCost) + CalcResidual(speedDifferenceCost);
}
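The commit description explains the two residual modes this cost feeds into: 'Absolute' keeps differences linear, while 'Squared' suppresses small differences and lets large ones dominate. A minimal standalone sketch of that idea (a stand-in illustration, not the gem's actual CalcResidual implementation):

```cpp
#include <cassert>
#include <cmath>

// Hypothetical residual-type switch mirroring the 'Absolute' vs 'Squared'
// options described in the commit message.
enum class ResidualType { Absolute, Squared };

// Absolute keeps the difference linear; Squared ignores minimal differences
// and overweights larger ones.
float CalcResidual(float value, ResidualType type)
{
    return (type == ResidualType::Squared) ? value * value : std::fabs(value);
}
```

With inputs below 1.0, squaring shrinks the residual; above 1.0, it grows it, which is exactly the "ignore small, punish large" behavior the option is for.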
void FeatureVelocity::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (!serializeContext)
{
return;
}
serializeContext->Class<FeatureVelocity, Feature>()
->Version(1)
;
AZ::EditContext* editContext = serializeContext->GetEditContext();
if (!editContext)
{
return;
}
editContext->Class<FeatureVelocity>("FeatureVelocity", "Matches joint velocities.")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AutoExpand, true)
;
}
size_t FeatureVelocity::GetNumDimensions() const
{
return 3;
}
AZStd::string FeatureVelocity::GetDimensionName(size_t index) const
{
AZStd::string result = m_jointName;
result += '.';
switch (index)
{
case 0: { result += "VelocityX"; break; }
case 1: { result += "VelocityY"; break; }
case 2: { result += "VelocityZ"; break; }
default: { result += Feature::GetDimensionName(index); }
}
return result;
}
AZ::Vector3 FeatureVelocity::GetFeatureData(const FeatureMatrix& featureMatrix, size_t frameIndex) const
{
return featureMatrix.GetVector3(frameIndex, m_featureColumnOffset);
}
void FeatureVelocity::SetFeatureData(FeatureMatrix& featureMatrix, size_t frameIndex, const AZ::Vector3& velocity)
{
featureMatrix.SetVector3(frameIndex, m_featureColumnOffset, velocity);
}
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,64 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Vector3.h>
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/RTTI/TypeInfo.h>
#include <AzCore/std/containers/vector.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <EMotionFX/Source/Velocity.h>
#include <Feature.h>
namespace AZ
{
class ReflectContext;
}
namespace EMotionFX::MotionMatching
{
class FrameDatabase;
class EMFX_API FeatureVelocity
: public Feature
{
public:
AZ_RTTI(FeatureVelocity, "{DEEA4F0F-CE70-4F16-9136-C2BFDDA29336}", Feature)
AZ_CLASS_ALLOCATOR_DECL
FeatureVelocity() = default;
~FeatureVelocity() override = default;
void ExtractFeatureValues(const ExtractFeatureContext& context) override;
static void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
const AZ::Vector3& velocity, // in world space
size_t jointIndex,
size_t relativeToJointIndex,
const AZ::Color& color);
void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay,
MotionMatchingInstance* instance,
size_t frameIndex) override;
float CalculateFrameCost(size_t frameIndex, const FrameCostContext& context) const override;
void FillQueryFeatureValues(size_t startIndex, AZStd::vector<float>& queryFeatureValues, const FrameCostContext& context) override;
static void Reflect(AZ::ReflectContext* context);
size_t GetNumDimensions() const override;
AZStd::string GetDimensionName(size_t index) const override;
AZ::Vector3 GetFeatureData(const FeatureMatrix& featureMatrix, size_t frameIndex) const;
void SetFeatureData(FeatureMatrix& featureMatrix, size_t frameIndex, const AZ::Vector3& velocity);
};
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,78 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <Allocators.h>
#include <Frame.h>
#include <EMotionFX/Source/MotionData/MotionData.h>
#include <EMotionFX/Source/TransformData.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(Frame, MotionMatchAllocator, 0)
Frame::Frame()
: m_frameIndex(InvalidIndex)
, m_sampleTime(0.0f)
, m_sourceMotion(nullptr)
, m_mirrored(false)
{
}
Frame::Frame(size_t frameIndex, Motion* sourceMotion, float sampleTime, bool mirrored)
: m_frameIndex(frameIndex)
, m_sampleTime(sampleTime)
, m_sourceMotion(sourceMotion)
, m_mirrored(mirrored)
{
}
void Frame::SamplePose(Pose* outputPose, float timeOffset) const
{
MotionDataSampleSettings sampleSettings;
sampleSettings.m_actorInstance = outputPose->GetActorInstance();
sampleSettings.m_inPlace = false;
sampleSettings.m_mirror = m_mirrored;
sampleSettings.m_retarget = false;
sampleSettings.m_inputPose = sampleSettings.m_actorInstance->GetTransformData()->GetBindPose();
sampleSettings.m_sampleTime = AZ::GetClamp(m_sampleTime + timeOffset, 0.0f, m_sourceMotion->GetDuration());
m_sourceMotion->SamplePose(outputPose, sampleSettings);
}
void Frame::SetFrameIndex(size_t frameIndex)
{
m_frameIndex = frameIndex;
}
Motion* Frame::GetSourceMotion() const
{
return m_sourceMotion;
}
float Frame::GetSampleTime() const
{
return m_sampleTime;
}
void Frame::SetSourceMotion(Motion* sourceMotion)
{
m_sourceMotion = sourceMotion;
}
void Frame::SetMirrored(bool enabled)
{
m_mirrored = enabled;
}
void Frame::SetSampleTime(float sampleTime)
{
m_sampleTime = sampleTime;
}
} // namespace EMotionFX::MotionMatching
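SamplePose's timeOffset parameter exists mainly so callers can sample poses slightly off a frame's exact time and estimate joint velocities by finite differences. A standalone sketch of that central-difference idea, using a plain position function in place of pose sampling (the gem's actual velocity code lives in CalculateVelocity, not shown here):

```cpp
#include <cassert>
#include <cmath>
#include <functional>

// Central-difference velocity estimate: sample a position slightly before and
// slightly after the frame time, then divide by the spanned time. SamplePose's
// timeOffset parameter enables exactly this kind of off-frame sampling.
float EstimateVelocity(const std::function<float(float)>& samplePosition,
    float sampleTime, float timeOffset)
{
    const float before = samplePosition(sampleTime - timeOffset);
    const float after = samplePosition(sampleTime + timeOffset);
    return (after - before) / (2.0f * timeOffset);
}
```

For a linearly moving position p(t) = 2t, the estimate recovers the true velocity of 2 regardless of the offset used.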

@ -0,0 +1,62 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <EMotionFX/Source/Pose.h>
namespace EMotionFX
{
class Motion;
namespace MotionMatching
{
/**
* A motion matching frame.
* This holds information required in order to extract a given pose in a given motion.
*/
class EMFX_API Frame
{
public:
AZ_RTTI(Frame, "{985BD732-D80E-4898-AB6C-CAB22D88AACD}")
AZ_CLASS_ALLOCATOR_DECL
Frame();
Frame(size_t frameIndex, Motion* sourceMotion, float sampleTime, bool mirrored);
~Frame() = default;
//! Sample the pose for the given frame.
//! @param[in] outputPose The pose used to store the sampled result.
//! @param[in] timeOffset Frames in the frame database are sampled at a given sample rate (default = 30 fps).
//! For calculating velocities, for example, it is necessary to sample a pose close to a frame but not exactly at the frame position.
//! The timeOffset parameter can be used for that and represents the offset in time, in seconds, from the frame sample time.
//! If the time offset is 0.0, the pose exactly at the frame position will be sampled.
void SamplePose(Pose* outputPose, float timeOffset = 0.0f) const;
Motion* GetSourceMotion() const;
float GetSampleTime() const;
size_t GetFrameIndex() const { return m_frameIndex; }
bool GetMirrored() const { return m_mirrored; }
void SetSourceMotion(Motion* sourceMotion);
void SetSampleTime(float sampleTime);
void SetFrameIndex(size_t frameIndex);
void SetMirrored(bool enabled);
private:
size_t m_frameIndex = 0; /**< The motion frame index inside the data object. */
float m_sampleTime = 0.0f; /**< The time offset in the original motion. */
Motion* m_sourceMotion = nullptr; /**< The original motion that we sample from to restore the pose. */
bool m_mirrored = false; /**< Is this frame mirrored? */
};
} // namespace MotionMatching
} // namespace EMotionFX

@ -0,0 +1,250 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <EMotionFX/Source/ActorInstance.h>
#include <Allocators.h>
#include <EMotionFX/Source/AnimGraphPose.h>
#include <EMotionFX/Source/AnimGraphPosePool.h>
#include <EMotionFX/Source/EMotionFXManager.h>
#include <EMotionFX/Source/Motion.h>
#include <MotionMatchingData.h>
#include <MotionMatchingInstance.h>
#include <Frame.h>
#include <Feature.h>
#include <FrameDatabase.h>
#include <EventData.h>
#include <EMotionFX/Source/Pose.h>
#include <EMotionFX/Source/TransformData.h>
#include <EMotionFX/Source/MotionEvent.h>
#include <EMotionFX/Source/MotionEventTable.h>
#include <EMotionFX/Source/MotionEventTrack.h>
#include <AzCore/Casting/numeric_cast.h>
#include <AzCore/Serialization/SerializeContext.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(FrameDatabase, MotionMatchAllocator, 0)
FrameDatabase::FrameDatabase()
{
}
FrameDatabase::~FrameDatabase()
{
Clear();
}
void FrameDatabase::Clear()
{
// Clear the frames.
m_frames.clear();
m_frames.shrink_to_fit();
m_frameIndexByMotion.clear();
// Clear other things.
m_usedMotions.clear();
m_usedMotions.shrink_to_fit();
}
void FrameDatabase::ExtractActiveMotionEventDatas(const Motion* motion, float time, AZStd::vector<EventData*>& activeEventDatas)
{
activeEventDatas.clear();
// Iterate over all motion event tracks and all events inside them.
const MotionEventTable* eventTable = motion->GetEventTable();
const size_t numTracks = eventTable->GetNumTracks();
for (size_t t = 0; t < numTracks; ++t)
{
const MotionEventTrack* track = eventTable->GetTrack(t);
const size_t numEvents = track->GetNumEvents();
for (size_t e = 0; e < numEvents; ++e)
{
const MotionEvent& motionEvent = track->GetEvent(e);
// Only handle range based events and events that include our time value.
if (motionEvent.GetIsTickEvent() ||
motionEvent.GetStartTime() > time ||
motionEvent.GetEndTime() < time)
{
continue;
}
for (auto eventData : motionEvent.GetEventDatas())
{
activeEventDatas.emplace_back(const_cast<EventData*>(eventData.get()));
}
}
}
}
bool FrameDatabase::IsFrameDiscarded(const AZStd::vector<EventData*>& activeEventDatas) const
{
for (const EventData* eventData : activeEventDatas)
{
if (eventData->RTTI_GetType() == azrtti_typeid<DiscardFrameEventData>())
{
return true;
}
}
return false;
}
AZStd::tuple<size_t, size_t> FrameDatabase::ImportFrames(Motion* motion, const FrameImportSettings& settings, bool mirrored)
{
AZ_PROFILE_SCOPE(Animation, "FrameDatabase::ImportFrames");
AZ_Assert(motion, "The motion cannot be a nullptr");
AZ_Assert(settings.m_sampleRate > 0, "The sample rate must be greater than zero frames per second");
AZ_Assert(settings.m_sampleRate <= 120, "The sample rate must not exceed 120 frames per second");
size_t numFramesImported = 0;
size_t numFramesDiscarded = 0;
// Calculate the number of frames we might need to import, in the worst case.
m_sampleRate = settings.m_sampleRate;
const double timeStep = 1.0 / aznumeric_cast<double>(settings.m_sampleRate);
const size_t worstCaseNumFrames = aznumeric_cast<size_t>(ceil(motion->GetDuration() / timeStep)) + 1;
// Try to pre-allocate memory for the worst case scenario.
if (m_frames.capacity() < m_frames.size() + worstCaseNumFrames)
{
m_frames.reserve(m_frames.size() + worstCaseNumFrames);
}
AZStd::vector<EventData*> activeEvents;
// Iterate over all sample positions in the motion.
const double totalTime = aznumeric_cast<double>(motion->GetDuration());
double curTime = 0.0;
while (curTime <= totalTime)
{
const float floatTime = aznumeric_cast<float>(curTime);
ExtractActiveMotionEventDatas(motion, floatTime, activeEvents);
if (!IsFrameDiscarded(activeEvents))
{
ImportFrame(motion, floatTime, mirrored);
numFramesImported++;
}
else
{
numFramesDiscarded++;
}
curTime += timeStep;
}
// Make sure we include the last frame, if we stepped over it.
if (curTime - timeStep < totalTime - 0.000001)
{
const float floatTime = aznumeric_cast<float>(totalTime);
ExtractActiveMotionEventDatas(motion, floatTime, activeEvents);
if (!IsFrameDiscarded(activeEvents))
{
ImportFrame(motion, floatTime, mirrored);
numFramesImported++;
}
else
{
numFramesDiscarded++;
}
}
// Automatically shrink the frame storage to its minimum size.
if (settings.m_autoShrink)
{
m_frames.shrink_to_fit();
}
// Register the motion.
if (AZStd::find(m_usedMotions.begin(), m_usedMotions.end(), motion) == m_usedMotions.end())
{
m_usedMotions.emplace_back(motion);
}
return { numFramesImported, numFramesDiscarded };
}
void FrameDatabase::ImportFrame(Motion* motion, float timeValue, bool mirrored)
{
m_frames.emplace_back(Frame(m_frames.size(), motion, timeValue, mirrored));
m_frameIndexByMotion[motion].emplace_back(m_frames.back().GetFrameIndex());
}
size_t FrameDatabase::CalcMemoryUsageInBytes() const
{
size_t total = 0;
total += m_frames.capacity() * sizeof(Frame);
total += sizeof(m_frames);
total += m_usedMotions.capacity() * sizeof(const Motion*);
total += sizeof(m_usedMotions);
return total;
}
size_t FrameDatabase::GetNumFrames() const
{
return m_frames.size();
}
size_t FrameDatabase::GetNumUsedMotions() const
{
return m_usedMotions.size();
}
const Motion* FrameDatabase::GetUsedMotion(size_t index) const
{
return m_usedMotions[index];
}
const Frame& FrameDatabase::GetFrame(size_t index) const
{
AZ_Assert(index < m_frames.size(), "Frame index is out of range!");
return m_frames[index];
}
AZStd::vector<Frame>& FrameDatabase::GetFrames()
{
return m_frames;
}
const AZStd::vector<Frame>& FrameDatabase::GetFrames() const
{
return m_frames;
}
const AZStd::vector<const Motion*>& FrameDatabase::GetUsedMotions() const
{
return m_usedMotions;
}
size_t FrameDatabase::FindFrameIndex(Motion* motion, float playtime) const
{
auto iterator = m_frameIndexByMotion.find(motion);
if (iterator == m_frameIndexByMotion.end())
{
return InvalidIndex;
}
const AZStd::vector<size_t>& frameIndices = iterator->second;
for (const size_t frameIndex : frameIndices)
{
const Frame& frame = m_frames[frameIndex];
if (playtime >= frame.GetSampleTime() &&
frameIndex + 1 < m_frames.size() &&
playtime <= m_frames[frameIndex + 1].GetSampleTime())
{
return frameIndex;
}
}
return InvalidIndex;
}
} // namespace EMotionFX::MotionMatching
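FindFrameIndex above scans a motion's frames linearly until it finds the interval containing the playtime, which is why the header flags it as slow and debug-only. Since frames imported at a fixed rate have ascending sample times, the same lookup could use a binary search; a standalone sketch of that alternative (a hypothetical helper over raw sample times, not part of the gem):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

constexpr size_t InvalidIndex = static_cast<size_t>(-1);

// Find the frame whose [sampleTime, nextSampleTime] interval contains the
// playtime. Assumes sampleTimes is sorted ascending, as frames imported at a
// fixed sample rate would be.
size_t FindFrameIndex(const std::vector<float>& sampleTimes, float playtime)
{
    if (sampleTimes.size() < 2 ||
        playtime < sampleTimes.front() ||
        playtime > sampleTimes.back())
    {
        return InvalidIndex;
    }
    // First sample time strictly greater than the playtime; the frame directly
    // before it starts the containing interval.
    const auto it = std::upper_bound(sampleTimes.begin(), sampleTimes.end(), playtime);
    return static_cast<size_t>(std::distance(sampleTimes.begin(), it)) - 1;
}
```

This turns the per-motion lookup from O(n) into O(log n), though for a visual-debugging-only path the linear scan in the gem is perfectly serviceable.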

@ -0,0 +1,86 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/std/containers/unordered_map.h>
#include <AzCore/std/containers/vector.h>
#include <AzCore/std/tuple.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <EventData.h>
#include <Frame.h>
namespace EMotionFX
{
class Motion;
class ActorInstance;
}
namespace EMotionFX::MotionMatching
{
class MotionMatchingInstance;
class MotionMatchEventData;
// The motion matching data.
// This is basically a database of frames (which point to motion objects), together with meta data per frame.
// No actual pose data is stored directly inside this class, just references to the right sample times inside specific motions.
class EMFX_API FrameDatabase
{
public:
AZ_RTTI(FrameDatabase, "{3E5ED4F9-8975-41F2-B665-0086368F0DDA}")
AZ_CLASS_ALLOCATOR_DECL
// The settings used when importing motions into the frame database.
// Used in combination with ImportFrames().
struct EMFX_API FrameImportSettings
{
size_t m_sampleRate = 30; /**< Sample at 30 frames per second on default. */
bool m_autoShrink = true; /**< Automatically shrink the internal frame arrays to their minimum size afterwards. */
};
FrameDatabase();
virtual ~FrameDatabase();
// Main functions.
AZStd::tuple<size_t, size_t> ImportFrames(Motion* motion, const FrameImportSettings& settings, bool mirrored); // Returns the number of imported frames and the number of discarded frames as second element.
void Clear(); // Clear the data, so you can re-initialize it with new data.
// Statistics.
size_t GetNumFrames() const;
size_t GetNumUsedMotions() const;
size_t CalcMemoryUsageInBytes() const;
// Misc.
const Motion* GetUsedMotion(size_t index) const;
const Frame& GetFrame(size_t index) const;
const AZStd::vector<Frame>& GetFrames() const;
AZStd::vector<Frame>& GetFrames();
const AZStd::vector<const Motion*>& GetUsedMotions() const;
size_t GetSampleRate() const { return m_sampleRate; }
/**
* Find the frame index for the given playtime and motion.
* NOTE: This is a slow operation and should not be used at runtime other than for visual debugging.
*/
size_t FindFrameIndex(Motion* motion, float playtime) const;
private:
void ImportFrame(Motion* motion, float timeValue, bool mirrored);
bool IsFrameDiscarded(const AZStd::vector<EventData*>& activeEventDatas) const;
void ExtractActiveMotionEventDatas(const Motion* motion, float time, AZStd::vector<EventData*>& activeEventDatas); // Vector will be cleared internally.
private:
AZStd::vector<Frame> m_frames; /**< The collection of frames. Keep in mind these don't hold a pose, but reference to a given frame/time value inside a given motion. */
AZStd::unordered_map<Motion*, AZStd::vector<size_t>> m_frameIndexByMotion;
AZStd::vector<const Motion*> m_usedMotions; /**< The list of used motions. */
size_t m_sampleRate = 0;
};
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,144 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#ifdef IMGUI_ENABLED
#include <Allocators.h>
#include <ImGuiMonitor.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(ImGuiMonitor, MotionMatchAllocator, 0)
ImGuiMonitor::ImGuiMonitor()
{
m_performanceStats.m_name = "Performance Statistics";
m_featureCosts.m_name = "Feature Costs";
m_featureCosts.m_histogramContainerCount = 100;
ImGui::ImGuiUpdateListenerBus::Handler::BusConnect();
ImGuiMonitorRequestBus::Handler::BusConnect();
}
ImGuiMonitor::~ImGuiMonitor()
{
ImGui::ImGuiUpdateListenerBus::Handler::BusDisconnect();
ImGuiMonitorRequestBus::Handler::BusDisconnect();
}
void ImGuiMonitor::OnImGuiUpdate()
{
if (!m_performanceStats.m_show && !m_featureCosts.m_show)
{
return;
}
if (ImGui::Begin("Motion Matching"))
{
if (ImGui::CollapsingHeader("Feature Matrix", ImGuiTreeNodeFlags_DefaultOpen | ImGuiTreeNodeFlags_Framed))
{
ImGui::Text("Memory Usage: %.2f MB", m_featureMatrixMemoryUsageInBytes / 1024.0f / 1024.0f);
ImGui::Text("Num Frames: %zu", m_featureMatrixNumFrames);
ImGui::Text("Num Feature Components: %zu", m_featureMatrixNumComponents);
}
if (ImGui::CollapsingHeader("Kd-Tree", ImGuiTreeNodeFlags_DefaultOpen | ImGuiTreeNodeFlags_Framed))
{
ImGui::Text("Memory Usage: %.2f MB", m_kdTreeMemoryUsageInBytes / 1024.0f / 1024.0f);
ImGui::Text("Num Nodes: %zu", m_kdTreeNumNodes);
ImGui::Text("Num Dimensions: %zu", m_kdTreeNumDimensions);
}
m_performanceStats.OnImGuiUpdate();
m_featureCosts.OnImGuiUpdate();
}
}
void ImGuiMonitor::OnImGuiMainMenuUpdate()
{
if (ImGui::BeginMenu("Motion Matching"))
{
ImGui::MenuItem(m_performanceStats.m_name.c_str(), "", &m_performanceStats.m_show);
ImGui::MenuItem(m_featureCosts.m_name.c_str(), "", &m_featureCosts.m_show);
ImGui::EndMenu();
}
}
void ImGuiMonitor::PushPerformanceHistogramValue(const char* performanceMetricName, float value)
{
m_performanceStats.PushHistogramValue(performanceMetricName, value, AZ::Color::CreateFromRgba(229,56,59,255));
}
void ImGuiMonitor::PushCostHistogramValue(const char* costName, float value, const AZ::Color& color)
{
m_featureCosts.PushHistogramValue(costName, value, color);
}
void ImGuiMonitor::HistogramGroup::PushHistogramValue(const char* valueName, float value, const AZ::Color& color)
{
auto iterator = m_histogramIndexByName.find(valueName);
if (iterator != m_histogramIndexByName.end())
{
ImGui::LYImGuiUtils::HistogramContainer& histogramContainer = m_histograms[iterator->second];
histogramContainer.PushValue(value);
histogramContainer.SetBarLineColor(ImColor(color.GetR(), color.GetG(), color.GetB(), color.GetA()));
}
else
{
ImGui::LYImGuiUtils::HistogramContainer newHistogram;
newHistogram.Init(/*histogramName=*/valueName,
/*containerCount=*/m_histogramContainerCount,
/*viewType=*/ImGui::LYImGuiUtils::HistogramContainer::ViewType::Histogram,
/*displayOverlays=*/true,
/*min=*/0.0f,
/*max=*/0.0f);
newHistogram.SetMoveDirection(ImGui::LYImGuiUtils::HistogramContainer::PushRightMoveLeft);
newHistogram.PushValue(value);
m_histogramIndexByName[valueName] = m_histograms.size();
m_histograms.push_back(newHistogram);
}
}
void ImGuiMonitor::HistogramGroup::OnImGuiUpdate()
{
if (!m_show)
{
return;
}
if (ImGui::CollapsingHeader(m_name.c_str(), ImGuiTreeNodeFlags_DefaultOpen | ImGuiTreeNodeFlags_Framed))
{
for (auto& histogram : m_histograms)
{
ImGui::BeginGroup();
{
histogram.Draw(ImGui::GetColumnWidth() - 70, s_histogramHeight);
ImGui::SameLine();
ImGui::PushStyleColor(ImGuiCol_Text, IM_COL32(0,0,0,255));
{
const ImColor color = histogram.GetBarLineColor();
ImGui::PushStyleColor(ImGuiCol_Button, color.Value);
{
const AZStd::string valueString = AZStd::string::format("%.2f", histogram.GetLastValue());
ImGui::Button(valueString.c_str());
}
ImGui::PopStyleColor();
}
ImGui::PopStyleColor();
}
ImGui::EndGroup();
}
}
}
} // namespace EMotionFX::MotionMatching
#endif // IMGUI_ENABLED

@ -0,0 +1,84 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#ifdef IMGUI_ENABLED
#include <AzCore/Component/EntityId.h>
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/std/containers/vector.h>
#include <AzCore/std/containers/unordered_map.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <imgui/imgui.h>
#include <ImGuiBus.h>
#include <ImGuiMonitorBus.h>
#include <LYImGuiUtils/HistogramContainer.h>
namespace EMotionFX::MotionMatching
{
class EMFX_API ImGuiMonitor
: public ImGui::ImGuiUpdateListenerBus::Handler
, public ImGuiMonitorRequestBus::Handler
{
public:
AZ_RTTI(ImGuiMonitor, "{BF1B85A4-215C-4E3A-8FD8-CE3233E5C779}")
AZ_CLASS_ALLOCATOR_DECL
ImGuiMonitor();
~ImGuiMonitor();
// ImGui::ImGuiUpdateListenerBus::Handler
void OnImGuiUpdate() override;
void OnImGuiMainMenuUpdate() override;
// ImGuiMonitorRequestBus::Handler
void PushPerformanceHistogramValue(const char* performanceMetricName, float value) override;
void PushCostHistogramValue(const char* costName, float value, const AZ::Color& color) override;
void SetFeatureMatrixMemoryUsage(size_t sizeInBytes) override { m_featureMatrixMemoryUsageInBytes = sizeInBytes; }
void SetFeatureMatrixNumFrames(size_t numFrames) override { m_featureMatrixNumFrames = numFrames; }
void SetFeatureMatrixNumComponents(size_t numFeatureComponents) override { m_featureMatrixNumComponents = numFeatureComponents; }
void SetKdTreeMemoryUsage(size_t sizeInBytes) override { m_kdTreeMemoryUsageInBytes = sizeInBytes; }
void SetKdTreeNumNodes(size_t numNodes) override { m_kdTreeNumNodes = numNodes; }
void SetKdTreeNumDimensions(size_t numDimensions) override { m_kdTreeNumDimensions = numDimensions; }
private:
//! Named and sub-divided group containing several histograms.
struct HistogramGroup
{
void OnImGuiUpdate();
void PushHistogramValue(const char* valueName, float value, const AZ::Color& color);
bool m_show = true;
AZStd::string m_name;
using HistogramIndexByNames = AZStd::unordered_map<const char*, size_t>;
HistogramIndexByNames m_histogramIndexByName;
AZStd::vector<ImGui::LYImGuiUtils::HistogramContainer> m_histograms;
int m_histogramContainerCount = 500;
static constexpr float s_histogramHeight = 95.0f;
};
HistogramGroup m_performanceStats;
HistogramGroup m_featureCosts;
size_t m_featureMatrixMemoryUsageInBytes = 0;
size_t m_featureMatrixNumFrames = 0;
size_t m_featureMatrixNumComponents = 0;
size_t m_kdTreeMemoryUsageInBytes = 0;
size_t m_kdTreeNumNodes = 0;
size_t m_kdTreeNumDimensions = 0;
};
} // namespace EMotionFX::MotionMatching
#endif // IMGUI_ENABLED

@ -0,0 +1,37 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Color.h>
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/EBus/EBus.h>
namespace EMotionFX::MotionMatching
{
class ImGuiMonitorRequests
: public AZ::EBusTraits
{
public:
// Enable multi-threaded access by locking a mutex when connecting handlers to the EBus or executing events.
using MutexType = AZStd::recursive_mutex;
virtual void PushPerformanceHistogramValue(const char* performanceMetricName, float value) = 0;
virtual void PushCostHistogramValue(const char* costName, float value, const AZ::Color& color) = 0;
virtual void SetFeatureMatrixMemoryUsage(size_t sizeInBytes) = 0;
virtual void SetFeatureMatrixNumFrames(size_t numFrames) = 0;
virtual void SetFeatureMatrixNumComponents(size_t numFeatureComponents) = 0;
virtual void SetKdTreeMemoryUsage(size_t sizeInBytes) = 0;
virtual void SetKdTreeNumNodes(size_t numNodes) = 0;
virtual void SetKdTreeNumDimensions(size_t numDimensions) = 0;
};
using ImGuiMonitorRequestBus = AZ::EBus<ImGuiMonitorRequests>;
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,454 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <KdTree.h>
#include <Feature.h>
#include <Allocators.h>
#include <AzCore/Debug/Timer.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(KdTree, MotionMatchAllocator, 0)
KdTree::~KdTree()
{
Clear();
}
size_t KdTree::CalcNumDimensions(const AZStd::vector<Feature*>& features)
{
size_t result = 0;
for (Feature* feature : features)
{
if (feature->GetId().IsNull())
{
continue;
}
result += feature->GetNumDimensions();
}
return result;
}
bool KdTree::Init(const FrameDatabase& frameDatabase,
const FeatureMatrix& featureMatrix,
const AZStd::vector<Feature*>& features,
size_t maxDepth,
size_t minFramesPerLeaf)
{
AZ::Debug::Timer timer;
timer.Stamp();
Clear();
// Verify the dimensions.
// Going above a 20 dimensional tree would start eating up too much memory.
m_numDimensions = CalcNumDimensions(features);
if (m_numDimensions == 0 || m_numDimensions > 20)
{
AZ_Error("Motion Matching", false, "Cannot initialize KD-tree. KD-tree dimension (%zu) has to be between 1 and 20. Please use Feature::SetIncludeInKdTree(false) on some features.", m_numDimensions);
return false;
}
if (minFramesPerLeaf > 100000)
{
AZ_Error("Motion Matching", false, "KdTree minFramesPerLeaf (%zu) cannot be larger than 100000.", minFramesPerLeaf);
return false;
}
if (maxDepth == 0)
{
AZ_Error("Motion Matching", false, "KdTree max depth (%zu) cannot be zero.", maxDepth);
return false;
}
m_maxDepth = maxDepth;
m_minFramesPerLeaf = minFramesPerLeaf;
// Build the tree.
m_featureValues.resize(m_numDimensions);
BuildTreeNodes(frameDatabase, featureMatrix, features, new Node(), nullptr, 0);
MergeSmallLeafNodesToParents();
ClearFramesForNonEssentialNodes();
RemoveZeroFrameLeafNodes();
const float initTime = timer.GetDeltaTimeInSeconds();
AZ_TracePrintf("EMotionFX", "KdTree initialized in %f seconds (numNodes = %zu, numDims = %zu, memory used = %.2f MB).",
initTime, m_nodes.size(),
m_numDimensions,
static_cast<float>(CalcMemoryUsageInBytes()) / 1024.0f / 1024.0f);
PrintStats();
return true;
}
void KdTree::Clear()
{
// delete all nodes
for (Node* node : m_nodes)
{
delete node;
}
m_nodes.clear();
m_featureValues.clear();
m_numDimensions = 0;
}
size_t KdTree::CalcMemoryUsageInBytes() const
{
size_t totalBytes = 0;
for (const Node* node : m_nodes)
{
totalBytes += sizeof(Node);
totalBytes += node->m_frames.capacity() * sizeof(size_t);
}
totalBytes += m_featureValues.capacity() * sizeof(float);
totalBytes += sizeof(KdTree);
return totalBytes;
}
bool KdTree::IsInitialized() const
{
return (m_numDimensions != 0);
}
size_t KdTree::GetNumNodes() const
{
return m_nodes.size();
}
size_t KdTree::GetNumDimensions() const
{
return m_numDimensions;
}
void KdTree::BuildTreeNodes(const FrameDatabase& frameDatabase,
const FeatureMatrix& featureMatrix,
const AZStd::vector<Feature*>& features,
Node* node,
Node* parent,
size_t dimension,
bool leftSide)
{
node->m_parent = parent;
node->m_dimension = dimension;
m_nodes.emplace_back(node);
// Fill the frames array and calculate the median.
FillFramesForNode(node, frameDatabase, featureMatrix, features, parent, leftSide);
// Prevent splitting further when we don't want to.
const size_t maxDimensions = AZ::GetMin(m_numDimensions, m_maxDepth);
if (node->m_frames.size() < m_minFramesPerLeaf * 2 ||
dimension >= maxDimensions)
{
return;
}
// Create the left node.
Node* leftNode = new Node();
AZ_Assert(!node->m_leftNode, "Expected the parent left node to be a nullptr");
node->m_leftNode = leftNode;
BuildTreeNodes(frameDatabase, featureMatrix, features, leftNode, node, dimension + 1, true);
// Create the right node.
Node* rightNode = new Node();
AZ_Assert(!node->m_rightNode, "Expected the parent right node to be a nullptr");
node->m_rightNode = rightNode;
BuildTreeNodes(frameDatabase, featureMatrix, features, rightNode, node, dimension + 1, false);
}
void KdTree::ClearFramesForNonEssentialNodes()
{
for (Node* node : m_nodes)
{
if (node->m_leftNode && node->m_rightNode)
{
node->m_frames.clear();
node->m_frames.shrink_to_fit();
}
}
}
void KdTree::RemoveLeafNode(Node* node)
{
Node* parent = node->m_parent;
if (parent->m_leftNode == node)
{
parent->m_leftNode = nullptr;
}
if (parent->m_rightNode == node)
{
parent->m_rightNode = nullptr;
}
// Remove it from the node vector.
const auto location = AZStd::find(m_nodes.begin(), m_nodes.end(), node);
AZ_Assert(location != m_nodes.end(), "Expected to find the item to remove.");
m_nodes.erase(location);
delete node;
}
void KdTree::MergeSmallLeafNodesToParents()
{
AZStd::vector<Node*> nodesToRemove;
for (Node* node : m_nodes)
{
// If we are a leaf node and we don't have enough frames.
if ((!node->m_leftNode && !node->m_rightNode) &&
node->m_frames.size() < m_minFramesPerLeaf)
{
nodesToRemove.emplace_back(node);
}
}
// Remove the actual nodes.
for (Node* node : nodesToRemove)
{
RemoveLeafNode(node);
}
}
void KdTree::RemoveZeroFrameLeafNodes()
{
AZStd::vector<Node*> nodesToRemove;
// Build a list of leaf nodes to remove.
// These are ones that have no feature inside them.
for (Node* node : m_nodes)
{
if ((!node->m_leftNode && !node->m_rightNode) &&
node->m_frames.empty())
{
nodesToRemove.emplace_back(node);
}
}
// Remove the actual nodes.
for (Node* node : nodesToRemove)
{
RemoveLeafNode(node);
}
}
void KdTree::FillFramesForNode(Node* node,
const FrameDatabase& frameDatabase,
const FeatureMatrix& featureMatrix,
const AZStd::vector<Feature*>& features,
Node* parent,
bool leftSide)
{
float median = 0.0f;
if (parent)
{
// Assume half of the parent frames are in this node.
node->m_frames.reserve((parent->m_frames.size() / 2) + 1);
// Add parent frames to this node, but only ones that should be on this side.
for (const size_t frameIndex : parent->m_frames)
{
FillFeatureValues(featureMatrix, features, frameIndex);
const float value = m_featureValues[parent->m_dimension];
if (leftSide)
{
if (value <= parent->m_median)
{
node->m_frames.emplace_back(frameIndex);
}
}
else
{
if (value > parent->m_median)
{
node->m_frames.emplace_back(frameIndex);
}
}
median += value;
}
}
else // We're the root node.
{
node->m_frames.reserve(frameDatabase.GetNumFrames());
for (const Frame& frame : frameDatabase.GetFrames())
{
const size_t frameIndex = frame.GetFrameIndex();
node->m_frames.emplace_back(frameIndex);
FillFeatureValues(featureMatrix, features, frameIndex);
median += m_featureValues[node->m_dimension];
}
}
if (!node->m_frames.empty())
{
median /= static_cast<float>(node->m_frames.size());
}
node->m_median = median;
}
void KdTree::FillFeatureValues(const FeatureMatrix& featureMatrix, const Feature* feature, size_t frameIndex, size_t startIndex)
{
const size_t numDimensions = feature->GetNumDimensions();
const size_t featureColumnOffset = feature->GetColumnOffset();
for (size_t i = 0; i < numDimensions; ++i)
{
m_featureValues[startIndex + i] = featureMatrix(frameIndex, featureColumnOffset + i);
}
}
void KdTree::FillFeatureValues(const FeatureMatrix& featureMatrix, const AZStd::vector<Feature*>& features, size_t frameIndex)
{
size_t startDimension = 0;
for (const Feature* feature : features)
{
FillFeatureValues(featureMatrix, feature, frameIndex, startDimension);
startDimension += feature->GetNumDimensions();
}
}
void KdTree::RecursiveCalcNumFrames(Node* node, size_t& outNumFrames) const
{
if (node->m_leftNode && node->m_rightNode)
{
RecursiveCalcNumFrames(node->m_leftNode, outNumFrames);
RecursiveCalcNumFrames(node->m_rightNode, outNumFrames);
}
else
{
outNumFrames += node->m_frames.size();
}
}
void KdTree::PrintStats()
{
size_t leftNumFrames = 0;
size_t rightNumFrames = 0;
if (m_nodes[0]->m_leftNode)
{
RecursiveCalcNumFrames(m_nodes[0]->m_leftNode, leftNumFrames);
}
if (m_nodes[0]->m_rightNode)
{
RecursiveCalcNumFrames(m_nodes[0]->m_rightNode, rightNumFrames);
}
const float numFrames = static_cast<float>(leftNumFrames + rightNumFrames);
const float halfFrames = numFrames / 2.0f;
const float balanceScore = 100.0f - (AZ::GetAbs(halfFrames - static_cast<float>(leftNumFrames)) / numFrames) * 100.0f;
// Get the maximum depth (each node's dimension index equals its depth in the tree).
size_t maxDepth = 0;
for (const Node* node : m_nodes)
{
maxDepth = AZ::GetMax(maxDepth, node->m_dimension);
}
AZ_TracePrintf("EMotionFX", "KdTree Balance Info: leftSide=%zu rightSide=%zu score=%.2f totalFrames=%zu maxDepth=%zu", leftNumFrames, rightNumFrames, balanceScore, leftNumFrames + rightNumFrames, maxDepth);
size_t numLeafNodes = 0;
size_t numZeroNodes = 0;
size_t minFrames = 1000000000;
size_t maxFrames = 0;
for (const Node* node : m_nodes)
{
if (node->m_leftNode || node->m_rightNode)
{
continue;
}
numLeafNodes++;
if (node->m_frames.empty())
{
numZeroNodes++;
}
AZ_TracePrintf("EMotionFX", "Frames = %zu", node->m_frames.size());
minFrames = AZ::GetMin(minFrames, node->m_frames.size());
maxFrames = AZ::GetMax(maxFrames, node->m_frames.size());
}
const size_t avgFrames = (numLeafNodes > 0) ? ((leftNumFrames + rightNumFrames) / numLeafNodes) : 0;
AZ_TracePrintf("EMotionFX", "KdTree Node Info: leafs=%zu avgFrames=%zu zeroFrames=%zu minFrames=%zu maxFrames=%zu", numLeafNodes, avgFrames, numZeroNodes, minFrames, maxFrames);
}
void KdTree::FindNearestNeighbors(const AZStd::vector<float>& frameFloats, AZStd::vector<size_t>& resultFrameIndices) const
{
AZ_Assert(IsInitialized() && !m_nodes.empty(), "Expecting a valid and initialized kdTree. Did you forget to call KdTree::Init()?");
Node* curNode = m_nodes[0];
// Step as far as we need to through the kdTree.
Node* nodeToSearch = nullptr;
const size_t numDimensions = frameFloats.size();
for (size_t d = 0; d < numDimensions; ++d)
{
AZ_Assert(curNode->m_dimension == d, "Dimension mismatch");
// We have children in both directions.
if (curNode->m_leftNode && curNode->m_rightNode)
{
curNode = (frameFloats[d] <= curNode->m_median) ? curNode->m_leftNode : curNode->m_rightNode;
}
else if (!curNode->m_leftNode && !curNode->m_rightNode) // we have a leaf node
{
nodeToSearch = curNode;
}
else
{
// We only have one of the two child nodes, so we can't descend on both sides.
if (curNode->m_leftNode)
{
if (frameFloats[d] <= curNode->m_median)
{
curNode = curNode->m_leftNode;
}
else
{
nodeToSearch = curNode;
}
}
else // We have a right node.
{
if (frameFloats[d] > curNode->m_median)
{
curNode = curNode->m_rightNode;
}
else
{
nodeToSearch = curNode;
}
}
}
// If we found our search node, perform a linear search through the frames inside this node.
if (nodeToSearch)
{
//AZ_Assert(d == nodeToSearch->m_dimension, "Dimension mismatch inside kdTree nearest neighbor search.");
FindNearestNeighbors(nodeToSearch, frameFloats, resultFrameIndices);
return;
}
}
FindNearestNeighbors(curNode, frameFloats, resultFrameIndices);
}
void KdTree::FindNearestNeighbors(Node* node, [[maybe_unused]] const AZStd::vector<float>& frameFloats, AZStd::vector<size_t>& resultFrameIndices) const
{
resultFrameIndices = node->m_frames;
}
} // namespace EMotionFX::MotionMatching
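The build and search logic above can be sketched in isolation. The following is a minimal, hypothetical stand-alone version (plain `std::` types instead of `AZStd`, one dimension consumed per tree level) of the mean-split bucketing that `BuildTreeNodes`/`FillFramesForNode` and `FindNearestNeighbors` implement; the names and structure are illustrative, not the engine implementation:

```cpp
#include <cassert>
#include <cstddef>
#include <memory>
#include <utility>
#include <vector>

// Hypothetical stand-in for the frame database: points[frameIndex][dimension].
using Points = std::vector<std::vector<float>>;

struct BucketNode
{
    std::unique_ptr<BucketNode> m_left;
    std::unique_ptr<BucketNode> m_right;
    float m_split = 0.0f;              // mean of the values along this node's dimension
    std::size_t m_dimension = 0;       // dimension index; equals the node's depth
    std::vector<std::size_t> m_frames; // frame indices, kept only in leaf nodes
};

// Split on the mean of the current dimension, mirroring the approximate-median
// split in KdTree::FillFramesForNode. Requires maxDepth <= number of dimensions.
std::unique_ptr<BucketNode> Build(const Points& points, std::vector<std::size_t> frames,
                                  std::size_t dimension, std::size_t maxDepth, std::size_t minFramesPerLeaf)
{
    auto node = std::make_unique<BucketNode>();
    node->m_dimension = dimension;
    float mean = 0.0f;
    for (std::size_t f : frames)
    {
        mean += points[f][dimension];
    }
    if (!frames.empty())
    {
        mean /= static_cast<float>(frames.size());
    }
    node->m_split = mean;
    if (dimension + 1 >= maxDepth || frames.size() <= minFramesPerLeaf)
    {
        node->m_frames = std::move(frames); // leaf: keep the candidate bucket
        return node;
    }
    std::vector<std::size_t> left;
    std::vector<std::size_t> right;
    for (std::size_t f : frames)
    {
        (points[f][dimension] <= mean ? left : right).push_back(f);
    }
    node->m_left = Build(points, std::move(left), dimension + 1, maxDepth, minFramesPerLeaf);
    node->m_right = Build(points, std::move(right), dimension + 1, maxDepth, minFramesPerLeaf);
    return node;
}

// Walk down one branch per dimension and return the leaf bucket. Like
// KdTree::FindNearestNeighbors, the result is a candidate set rather than an
// exact nearest neighbor; the narrow-phase cost search runs on these candidates.
const std::vector<std::size_t>& FindCandidates(const BucketNode* node, const std::vector<float>& query)
{
    while (node->m_left && node->m_right)
    {
        node = (query[node->m_dimension] <= node->m_split) ? node->m_left.get() : node->m_right.get();
    }
    return node->m_frames;
}
```

With four 2D frames, `Build(pts, {0, 1, 2, 3}, 0, 2, 1)` splits once on the first dimension (mean 0.5), so a query of `(0.9, 0.9)` descends into the right bucket.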

@ -0,0 +1,95 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Memory/Memory.h>
#include <AzCore/std/containers/vector.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <Feature.h>
#include <FeatureMatrix.h>
#include <FrameDatabase.h>
namespace EMotionFX::MotionMatching
{
class KdTree
{
public:
AZ_RTTI(KdTree, "{CDA707EC-4150-463B-8157-90D98351ACED}")
AZ_CLASS_ALLOCATOR_DECL
KdTree() = default;
virtual ~KdTree();
bool Init(const FrameDatabase& frameDatabase,
const FeatureMatrix& featureMatrix,
const AZStd::vector<Feature*>& features,
size_t maxDepth=10,
size_t minFramesPerLeaf=1000);
/**
* Calculate the number of dimensions or values for the given feature set.
* Each feature might store one or multiple values inside the feature matrix and the number of
* values each feature holds varies with the feature type. This calculates the sum of the number of
* values of the given feature set.
*/
static size_t CalcNumDimensions(const AZStd::vector<Feature*>& features);
void Clear();
void PrintStats();
size_t GetNumNodes() const;
size_t GetNumDimensions() const;
size_t CalcMemoryUsageInBytes() const;
bool IsInitialized() const;
void FindNearestNeighbors(const AZStd::vector<float>& frameFloats, AZStd::vector<size_t>& resultFrameIndices) const;
private:
struct Node
{
Node* m_leftNode = nullptr;
Node* m_rightNode = nullptr;
Node* m_parent = nullptr;
float m_median = 0.0f;
size_t m_dimension = 0;
AZStd::vector<size_t> m_frames;
};
void BuildTreeNodes(const FrameDatabase& frameDatabase,
const FeatureMatrix& featureMatrix,
const AZStd::vector<Feature*>& features,
Node* node,
Node* parent,
size_t dimension = 0,
bool leftSide = true);
void FillFeatureValues(const FeatureMatrix& featureMatrix, const Feature* feature, size_t frameIndex, size_t startIndex);
void FillFeatureValues(const FeatureMatrix& featureMatrix, const AZStd::vector<Feature*>& features, size_t frameIndex);
void FillFramesForNode(Node* node,
const FrameDatabase& frameDatabase,
const FeatureMatrix& featureMatrix,
const AZStd::vector<Feature*>& features,
Node* parent,
bool leftSide);
void RecursiveCalcNumFrames(Node* node, size_t& outNumFrames) const;
void ClearFramesForNonEssentialNodes();
void MergeSmallLeafNodesToParents();
void RemoveZeroFrameLeafNodes();
void RemoveLeafNode(Node* node);
void FindNearestNeighbors(Node* node, const AZStd::vector<float>& frameFloats, AZStd::vector<size_t>& resultFrameIndices) const;
private:
AZStd::vector<Node*> m_nodes;
AZStd::vector<float> m_featureValues;
size_t m_numDimensions = 0;
size_t m_maxDepth = 20;
size_t m_minFramesPerLeaf = 1000;
};
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,181 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Debug/Timer.h>
#include <AzCore/Component/ComponentApplicationBus.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
#include <EMotionFX/Source/ActorInstance.h>
#include <EMotionFX/Source/AnimGraphPose.h>
#include <EMotionFX/Source/Motion.h>
#include <Allocators.h>
#include <Feature.h>
#include <FeatureSchemaDefault.h>
#include <FeatureTrajectory.h>
#include <FrameDatabase.h>
#include <KdTree.h>
#include <MotionMatchingData.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(MotionMatchingData, MotionMatchAllocator, 0)
MotionMatchingData::MotionMatchingData(const FeatureSchema& featureSchema)
: m_featureSchema(featureSchema)
{
m_kdTree = AZStd::make_unique<KdTree>();
}
MotionMatchingData::~MotionMatchingData()
{
Clear();
}
bool MotionMatchingData::ExtractFeatures(ActorInstance* actorInstance, FrameDatabase* frameDatabase, size_t maxKdTreeDepth, size_t minFramesPerKdTreeNode)
{
AZ_PROFILE_SCOPE(Animation, "MotionMatchingData::ExtractFeatures");
AZ::Debug::Timer timer;
timer.Stamp();
const size_t numFrames = frameDatabase->GetNumFrames();
if (numFrames == 0)
{
return true;
}
// Initialize all features before we process each frame.
FeatureMatrix::Index featureComponentCount = 0;
for (Feature* feature : m_featureSchema.GetFeatures())
{
Feature::InitSettings frameSettings;
frameSettings.m_actorInstance = actorInstance;
if (!feature->Init(frameSettings))
{
return false;
}
feature->SetColumnOffset(featureComponentCount);
featureComponentCount += feature->GetNumDimensions();
}
const auto& frames = frameDatabase->GetFrames();
// Allocate memory for the feature matrix
m_featureMatrix.resize(/*rows=*/numFrames, /*columns=*/featureComponentCount);
// Iterate over all frames and extract the data for this frame.
AnimGraphPosePool& posePool = GetEMotionFX().GetThreadData(actorInstance->GetThreadIndex())->GetPosePool();
AnimGraphPose* pose = posePool.RequestPose(actorInstance);
Feature::ExtractFeatureContext context(m_featureMatrix);
context.m_frameDatabase = frameDatabase;
context.m_framePose = &pose->GetPose();
context.m_actorInstance = actorInstance;
for (const Frame& frame : frames)
{
context.m_frameIndex = frame.GetFrameIndex();
// Pre-sample the frame pose as that will be needed by many of the feature extraction calculations.
frame.SamplePose(const_cast<Pose*>(context.m_framePose));
// Extract all features for the given frame.
{
for (Feature* feature : m_featureSchema.GetFeatures())
{
feature->ExtractFeatureValues(context);
}
}
}
posePool.FreePose(pose);
const float extractFeaturesTime = timer.GetDeltaTimeInSeconds();
timer.Stamp();
// Initialize the kd-tree used to accelerate the searches.
if (!m_kdTree->Init(*frameDatabase, m_featureMatrix, m_featuresInKdTree, maxKdTreeDepth, minFramesPerKdTreeNode)) // Internally automatically clears any existing contents.
{
AZ_Error("EMotionFX", false, "Failed to initialize KdTree acceleration structure.");
return false;
}
const float initKdTreeTimer = timer.GetDeltaTimeInSeconds();
AZ_Printf("MotionMatching", "Feature matrix (%zu, %zu) uses %.2f MB and took %.2f ms to initialize (KD-Tree %.2f ms).",
m_featureMatrix.rows(),
m_featureMatrix.cols(),
static_cast<float>(m_featureMatrix.CalcMemoryUsageInBytes()) / 1024.0f / 1024.0f,
extractFeaturesTime * 1000.0f,
initKdTreeTimer * 1000.0f);
return true;
}
bool MotionMatchingData::Init(const InitSettings& settings)
{
AZ_PROFILE_SCOPE(Animation, "MotionMatchingData::Init");
// Import all motion frames.
size_t totalNumFramesImported = 0;
size_t totalNumFramesDiscarded = 0;
for (Motion* motion : settings.m_motionList)
{
size_t numFrames = 0;
size_t numDiscarded = 0;
std::tie(numFrames, numDiscarded) = m_frameDatabase.ImportFrames(motion, settings.m_frameImportSettings, false);
totalNumFramesImported += numFrames;
totalNumFramesDiscarded += numDiscarded;
if (settings.m_importMirrored)
{
std::tie(numFrames, numDiscarded) = m_frameDatabase.ImportFrames(motion, settings.m_frameImportSettings, true);
totalNumFramesImported += numFrames;
totalNumFramesDiscarded += numDiscarded;
}
}
if (totalNumFramesImported > 0 || totalNumFramesDiscarded > 0)
{
AZ_TracePrintf("Motion Matching", "Imported a total of %zu frames (%zu frames discarded) across %zu motions. This is %.2f seconds (%.2f minutes) of motion data.",
totalNumFramesImported,
totalNumFramesDiscarded,
settings.m_motionList.size(),
totalNumFramesImported / (float)settings.m_frameImportSettings.m_sampleRate,
(totalNumFramesImported / (float)settings.m_frameImportSettings.m_sampleRate) / 60.0f);
}
// Use all features other than the trajectory for the broad-phase search using the KD-Tree.
for (Feature* feature : m_featureSchema.GetFeatures())
{
if (feature->RTTI_GetType() != azrtti_typeid<FeatureTrajectory>())
{
m_featuresInKdTree.push_back(feature);
}
}
// Extract feature data and place the values into the feature matrix.
if (!ExtractFeatures(settings.m_actorInstance, &m_frameDatabase, settings.m_maxKdTreeDepth, settings.m_minFramesPerKdTreeNode))
{
AZ_Error("Motion Matching", false, "Failed to extract features from motion database.");
return false;
}
return true;
}
void MotionMatchingData::Clear()
{
m_frameDatabase.Clear();
m_featureMatrix.Clear();
m_kdTree->Clear();
m_featuresInKdTree.clear();
}
} // namespace EMotionFX::MotionMatching
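The column layout that `ExtractFeatures` sets up can be illustrated on its own: every feature claims a contiguous block of feature-matrix columns, and each feature's offset is the running sum of the preceding features' dimension counts. A minimal sketch with hypothetical per-feature dimension counts (the function name and inputs are illustrative, not engine API):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical stand-in for Feature::GetNumDimensions(): each entry is the
// number of values one feature writes per frame (e.g. 3 for a position).
// Returns the column offset assigned to each feature and the total column
// count, mirroring the offset-assignment loop in MotionMatchingData::ExtractFeatures.
std::vector<std::size_t> AssignColumnOffsets(const std::vector<std::size_t>& featureDims,
                                             std::size_t& outTotalColumns)
{
    std::vector<std::size_t> offsets;
    offsets.reserve(featureDims.size());
    outTotalColumns = 0;
    for (std::size_t dims : featureDims)
    {
        offsets.push_back(outTotalColumns); // each feature starts where the previous one ended
        outTotalColumns += dims;
    }
    return offsets;
}
```

For three features with 3, 3, and 6 values per frame, the offsets come out as 0, 3, and 6 with 12 columns in total, so row `frameIndex` of the matrix holds that frame's full feature vector.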

@ -0,0 +1,74 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzCore/std/containers/vector.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <Feature.h>
#include <FeatureSchema.h>
#include <FrameDatabase.h>
#include <KdTree.h>
namespace AZ
{
class ReflectContext;
}
namespace EMotionFX
{
class ActorInstance;
}
namespace EMotionFX::MotionMatching
{
class EMFX_API MotionMatchingData
{
public:
AZ_RTTI(MotionMatchingData, "{7BC3DFF5-8864-4518-B6F0-0553ADFAB5C1}")
AZ_CLASS_ALLOCATOR_DECL
MotionMatchingData(const FeatureSchema& featureSchema);
virtual ~MotionMatchingData();
struct EMFX_API InitSettings
{
ActorInstance* m_actorInstance = nullptr;
AZStd::vector<Motion*> m_motionList;
FrameDatabase::FrameImportSettings m_frameImportSettings;
size_t m_maxKdTreeDepth = 20;
size_t m_minFramesPerKdTreeNode = 1000;
bool m_importMirrored = false;
};
bool Init(const InitSettings& settings);
void Clear();
const FrameDatabase& GetFrameDatabase() const { return m_frameDatabase; }
FrameDatabase& GetFrameDatabase() { return m_frameDatabase; }
const FeatureSchema& GetFeatureSchema() const { return m_featureSchema; }
const FeatureMatrix& GetFeatureMatrix() const { return m_featureMatrix; }
const KdTree& GetKdTree() const { return *m_kdTree.get(); }
const AZStd::vector<Feature*>& GetFeaturesInKdTree() const { return m_featuresInKdTree; }
protected:
bool ExtractFeatures(ActorInstance* actorInstance, FrameDatabase* frameDatabase, size_t maxKdTreeDepth=20, size_t minFramesPerKdTreeNode=2000);
FrameDatabase m_frameDatabase; /**< The animation database with all the keyframes and joint transform data. */
const FeatureSchema& m_featureSchema;
FeatureMatrix m_featureMatrix;
AZStd::unique_ptr<KdTree> m_kdTree; /**< The acceleration structure to speed up the search for lowest cost frames. */
AZStd::vector<Feature*> m_featuresInKdTree;
};
} // namespace EMotionFX::MotionMatching
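MotionMatchingData wires these pieces into a two-phase search: the kd-tree narrows the motion database down to a candidate set using every feature except the trajectory (broad phase), and a linear cost evaluation over those candidates picks the lowest cost frame (narrow phase). The following is a self-contained sketch of the narrow phase using a plain squared-residual cost over hypothetical feature rows; the real cost combines weighted per-feature costs and the trajectory query, so this only illustrates the shape of the search:

```cpp
#include <cassert>
#include <cstddef>
#include <limits>
#include <vector>

// Narrow-phase selection: given candidate frame indices from a broad-phase
// structure, pick the frame whose feature row is closest to the query.
// featureRows[frame] is that frame's full feature vector (hypothetical layout).
std::size_t FindLowestCostFrame(const std::vector<std::vector<float>>& featureRows,
                                const std::vector<std::size_t>& candidates,
                                const std::vector<float>& query)
{
    std::size_t bestFrame = static_cast<std::size_t>(-1);
    float bestCost = std::numeric_limits<float>::max();
    for (std::size_t frame : candidates)
    {
        float cost = 0.0f;
        for (std::size_t i = 0; i < query.size(); ++i)
        {
            const float diff = featureRows[frame][i] - query[i];
            cost += diff * diff; // squared residual, one of the two residual modes
        }
        if (cost < bestCost)
        {
            bestCost = cost;
            bestFrame = frame;
        }
    }
    return bestFrame;
}
```

Because the broad phase already discarded most frames, this linear scan only touches the frames in one kd-tree leaf bucket rather than the whole database.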

@ -0,0 +1,40 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <MotionMatchingModuleInterface.h>
#include <MotionMatchingEditorSystemComponent.h>
namespace EMotionFX::MotionMatching
{
class MotionMatchingEditorModule
: public MotionMatchingModuleInterface
{
public:
AZ_RTTI(MotionMatchingEditorModule, "{cf4381d1-0207-4ef8-85f0-6c88ec28a7b6}", MotionMatchingModuleInterface);
AZ_CLASS_ALLOCATOR(MotionMatchingEditorModule, AZ::SystemAllocator, 0);
MotionMatchingEditorModule()
{
m_descriptors.insert(m_descriptors.end(),
{
MotionMatchingEditorSystemComponent::CreateDescriptor(),
});
}
/// Add required SystemComponents to the SystemEntity. Non-SystemComponents should not be added here.
AZ::ComponentTypeList GetRequiredSystemComponents() const override
{
return AZ::ComponentTypeList
{
azrtti_typeid<MotionMatchingEditorSystemComponent>(),
};
}
};
} // namespace EMotionFX::MotionMatching
AZ_DECLARE_MODULE_CLASS(Gem_MotionMatching, EMotionFX::MotionMatching::MotionMatchingEditorModule)

@ -0,0 +1,60 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Serialization/SerializeContext.h>
#include <MotionMatchingEditorSystemComponent.h>
namespace EMotionFX::MotionMatching
{
void MotionMatchingEditorSystemComponent::Reflect(AZ::ReflectContext* context)
{
if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
{
serializeContext->Class<MotionMatchingEditorSystemComponent, MotionMatchingSystemComponent>()
->Version(0);
}
}
MotionMatchingEditorSystemComponent::MotionMatchingEditorSystemComponent() = default;
MotionMatchingEditorSystemComponent::~MotionMatchingEditorSystemComponent() = default;
void MotionMatchingEditorSystemComponent::GetProvidedServices(AZ::ComponentDescriptor::DependencyArrayType& provided)
{
BaseSystemComponent::GetProvidedServices(provided);
provided.push_back(AZ_CRC_CE("MotionMatchingEditorService"));
}
void MotionMatchingEditorSystemComponent::GetIncompatibleServices(AZ::ComponentDescriptor::DependencyArrayType& incompatible)
{
BaseSystemComponent::GetIncompatibleServices(incompatible);
incompatible.push_back(AZ_CRC_CE("MotionMatchingEditorService"));
}
void MotionMatchingEditorSystemComponent::GetRequiredServices([[maybe_unused]] AZ::ComponentDescriptor::DependencyArrayType& required)
{
BaseSystemComponent::GetRequiredServices(required);
}
void MotionMatchingEditorSystemComponent::GetDependentServices([[maybe_unused]] AZ::ComponentDescriptor::DependencyArrayType& dependent)
{
BaseSystemComponent::GetDependentServices(dependent);
}
void MotionMatchingEditorSystemComponent::Activate()
{
MotionMatchingSystemComponent::Activate();
AzToolsFramework::EditorEvents::Bus::Handler::BusConnect();
}
void MotionMatchingEditorSystemComponent::Deactivate()
{
AzToolsFramework::EditorEvents::Bus::Handler::BusDisconnect();
MotionMatchingSystemComponent::Deactivate();
}
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,40 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <MotionMatchingSystemComponent.h>
#include <AzToolsFramework/Entity/EditorEntityContextBus.h>
namespace EMotionFX::MotionMatching
{
/// System component for MotionMatching editor
class MotionMatchingEditorSystemComponent
: public MotionMatchingSystemComponent
, private AzToolsFramework::EditorEvents::Bus::Handler
{
using BaseSystemComponent = MotionMatchingSystemComponent;
public:
AZ_COMPONENT(MotionMatchingEditorSystemComponent, "{a43957d3-5a2d-4c29-873d-7daacc357722}", BaseSystemComponent);
static void Reflect(AZ::ReflectContext* context);
MotionMatchingEditorSystemComponent();
~MotionMatchingEditorSystemComponent();
private:
static void GetProvidedServices(AZ::ComponentDescriptor::DependencyArrayType& provided);
static void GetIncompatibleServices(AZ::ComponentDescriptor::DependencyArrayType& incompatible);
static void GetRequiredServices(AZ::ComponentDescriptor::DependencyArrayType& required);
static void GetDependentServices(AZ::ComponentDescriptor::DependencyArrayType& dependent);
// AZ::Component
void Activate() override;
void Deactivate() override;
};
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,571 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Debug/Timer.h>
#include <AzCore/Component/ComponentApplicationBus.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/SerializeContext.h>
#include <EMotionFX/Source/ActorInstance.h>
#include <Allocators.h>
#include <EMotionFX/Source/EMotionFXManager.h>
#include <EMotionFX/Source/Motion.h>
#include <EMotionFX/Source/MotionInstance.h>
#include <EMotionFX/Source/MotionInstancePool.h>
#include <MotionMatchingData.h>
#include <MotionMatchingInstance.h>
#include <Feature.h>
#include <FeatureSchema.h>
#include <FeatureTrajectory.h>
#include <KdTree.h>
#include <ImGuiMonitorBus.h>
#include <EMotionFX/Source/Pose.h>
#include <EMotionFX/Source/TransformData.h>
#include <PoseDataJointVelocities.h>
#include <EMotionFX/Tools/EMotionStudio/EMStudioSDK/Source/RenderPlugin/ViewportPluginBus.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(MotionMatchingInstance, MotionMatchAllocator, 0)
MotionMatchingInstance::~MotionMatchingInstance()
{
if (m_motionInstance)
{
GetMotionInstancePool().Free(m_motionInstance);
}
if (m_prevMotionInstance)
{
GetMotionInstancePool().Free(m_prevMotionInstance);
}
}
MotionInstance* MotionMatchingInstance::CreateMotionInstance() const
{
MotionInstance* result = GetMotionInstancePool().RequestNew(m_data->GetFrameDatabase().GetFrame(0).GetSourceMotion(), m_actorInstance);
return result;
}
void MotionMatchingInstance::Init(const InitSettings& settings)
{
AZ_Assert(settings.m_actorInstance, "The actor instance cannot be a nullptr.");
AZ_Assert(settings.m_data, "The motion match data cannot be nullptr.");
// Update the cached pointer to the trajectory feature.
const FeatureSchema& featureSchema = settings.m_data->GetFeatureSchema();
for (Feature* feature : featureSchema.GetFeatures())
{
if (feature->RTTI_GetType() == azrtti_typeid<FeatureTrajectory>())
{
m_cachedTrajectoryFeature = static_cast<FeatureTrajectory*>(feature);
break;
}
}
// Debug display initialization.
const auto AddDebugDisplay = [this](AZ::s32 debugDisplayId)
{
if (debugDisplayId == -1)
{
return;
}
AzFramework::DebugDisplayRequestBus::BusPtr debugDisplayBus;
AzFramework::DebugDisplayRequestBus::Bind(debugDisplayBus, debugDisplayId);
AzFramework::DebugDisplayRequests* debugDisplay = AzFramework::DebugDisplayRequestBus::FindFirstHandler(debugDisplayBus);
if (debugDisplay)
{
m_debugDisplays.emplace_back(debugDisplay);
}
};
// Draw the debug visualizations to the Animation Editor as well as the LY Editor viewport.
AZ::s32 animationEditorViewportId = -1;
EMStudio::ViewportPluginRequestBus::BroadcastResult(animationEditorViewportId, &EMStudio::ViewportPluginRequestBus::Events::GetViewportId);
AddDebugDisplay(animationEditorViewportId);
AddDebugDisplay(AzFramework::g_defaultSceneEntityDebugDisplayId);
m_actorInstance = settings.m_actorInstance;
m_data = settings.m_data;
if (settings.m_data->GetFrameDatabase().GetNumFrames() == 0)
{
return;
}
if (!m_motionInstance)
{
m_motionInstance = CreateMotionInstance();
}
if (!m_prevMotionInstance)
{
m_prevMotionInstance = CreateMotionInstance();
}
m_blendSourcePose.LinkToActorInstance(m_actorInstance);
m_blendSourcePose.InitFromBindPose(m_actorInstance);
m_blendTargetPose.LinkToActorInstance(m_actorInstance);
m_blendTargetPose.InitFromBindPose(m_actorInstance);
m_queryPose.LinkToActorInstance(m_actorInstance);
m_queryPose.InitFromBindPose(m_actorInstance);
// Make sure we have enough space inside the frame floats array, which is used to search the kdTree.
const size_t numValuesInKdTree = m_data->GetKdTree().GetNumDimensions();
m_queryFeatureValues.resize(numValuesInKdTree);
// Initialize the trajectory history.
size_t rootJointIndex = m_actorInstance->GetActor()->GetMotionExtractionNodeIndex();
if (rootJointIndex == InvalidIndex32)
{
rootJointIndex = 0;
}
m_trajectoryHistory.Init(*m_actorInstance->GetTransformData()->GetCurrentPose(),
rootJointIndex,
m_cachedTrajectoryFeature->GetFacingAxisDir(),
m_trajectorySecsToTrack);
}
void MotionMatchingInstance::DebugDraw()
{
if (m_data && !m_debugDisplays.empty())
{
for (AzFramework::DebugDisplayRequests* debugDisplay : m_debugDisplays)
{
if (debugDisplay)
{
const AZ::u32 prevState = debugDisplay->GetState();
DebugDraw(*debugDisplay);
debugDisplay->SetState(prevState);
}
}
}
}
void MotionMatchingInstance::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay)
{
AZ_PROFILE_SCOPE(Animation, "MotionMatchingInstance::DebugDraw");
// Get the lowest cost frame index from the last search. As the feature database is searched at a much lower
// frequency than the animation is sampled, this frame index lags behind the currently shown pose, so it does
// not represent that pose's current feature values.
const size_t curFrameIndex = GetLowestCostFrameIndex();
if (curFrameIndex == InvalidIndex)
{
return;
}
const FrameDatabase& frameDatabase = m_data->GetFrameDatabase();
const FeatureSchema& featureSchema = m_data->GetFeatureSchema();
// Find the frame index in the frame database that belongs to the currently used pose.
const size_t currentFrame = frameDatabase.FindFrameIndex(m_motionInstance->GetMotion(), m_motionInstance->GetCurrentTime());
// Render the feature debug visualizations for the current frame.
if (currentFrame != InvalidIndex)
{
for (Feature* feature: featureSchema.GetFeatures())
{
if (feature->GetDebugDrawEnabled())
{
feature->DebugDraw(debugDisplay, this, currentFrame);
}
}
}
// Draw the desired future trajectory and the sampled version of the past trajectory.
const AZ::Color trajectoryQueryColor = AZ::Color::CreateFromRgba(90, 219, 64, 255);
m_trajectoryQuery.DebugDraw(debugDisplay, trajectoryQueryColor);
// Draw the trajectory history starting after the sampled version of the past trajectory.
m_trajectoryHistory.DebugDraw(debugDisplay, trajectoryQueryColor, m_cachedTrajectoryFeature->GetPastTimeRange());
}
void MotionMatchingInstance::SamplePose(MotionInstance* motionInstance, Pose& outputPose)
{
const Pose* bindPose = m_actorInstance->GetTransformData()->GetBindPose();
motionInstance->GetMotion()->Update(bindPose, &outputPose, motionInstance);
if (m_actorInstance->GetActor()->GetMotionExtractionNode() && m_actorInstance->GetMotionExtractionEnabled())
{
outputPose.CompensateForMotionExtraction();
}
}
void MotionMatchingInstance::SamplePose(Motion* motion, Pose& outputPose, float sampleTime) const
{
MotionDataSampleSettings sampleSettings;
sampleSettings.m_actorInstance = outputPose.GetActorInstance();
sampleSettings.m_inPlace = false;
sampleSettings.m_mirror = false;
sampleSettings.m_retarget = false;
sampleSettings.m_inputPose = sampleSettings.m_actorInstance->GetTransformData()->GetBindPose();
sampleSettings.m_sampleTime = AZ::GetClamp(sampleTime, 0.0f, motion->GetDuration());
motion->SamplePose(&outputPose, sampleSettings);
}
void MotionMatchingInstance::PostUpdate([[maybe_unused]] float timeDelta)
{
if (!m_data)
{
m_motionExtractionDelta.Identity();
return;
}
const size_t lowestCostFrame = GetLowestCostFrameIndex();
if (m_data->GetFrameDatabase().GetNumFrames() == 0 || lowestCostFrame == InvalidIndex)
{
m_motionExtractionDelta.Identity();
return;
}
// Blend the motion extraction deltas.
// Note: Make sure to update the previous as well as the current/target motion instances.
if (m_blendWeight >= 1.0f - AZ::Constants::FloatEpsilon)
{
m_motionInstance->ExtractMotion(m_motionExtractionDelta);
}
else if (m_blendWeight > AZ::Constants::FloatEpsilon && m_blendWeight < 1.0f - AZ::Constants::FloatEpsilon)
{
Transform targetMotionExtractionDelta;
m_prevMotionInstance->ExtractMotion(m_motionExtractionDelta);
m_motionInstance->ExtractMotion(targetMotionExtractionDelta);
m_motionExtractionDelta.Blend(targetMotionExtractionDelta, m_blendWeight);
}
else
{
m_prevMotionInstance->ExtractMotion(m_motionExtractionDelta);
}
}
void MotionMatchingInstance::Output(Pose& outputPose)
{
AZ_PROFILE_SCOPE(Animation, "MotionMatchingInstance::Output");
if (!m_data)
{
outputPose.InitFromBindPose(m_actorInstance);
return;
}
const size_t lowestCostFrame = GetLowestCostFrameIndex();
if (m_data->GetFrameDatabase().GetNumFrames() == 0 || lowestCostFrame == InvalidIndex)
{
outputPose.InitFromBindPose(m_actorInstance);
return;
}
// Sample the motions and blend the results when needed.
if (m_blendWeight >= 1.0f - AZ::Constants::FloatEpsilon)
{
m_blendTargetPose.InitFromBindPose(m_actorInstance);
if (m_motionInstance)
{
SamplePose(m_motionInstance, m_blendTargetPose);
}
outputPose = m_blendTargetPose;
}
else if (m_blendWeight > AZ::Constants::FloatEpsilon && m_blendWeight < 1.0f - AZ::Constants::FloatEpsilon)
{
m_blendSourcePose.InitFromBindPose(m_actorInstance);
m_blendTargetPose.InitFromBindPose(m_actorInstance);
if (m_motionInstance)
{
SamplePose(m_motionInstance, m_blendTargetPose);
}
if (m_prevMotionInstance)
{
SamplePose(m_prevMotionInstance, m_blendSourcePose);
}
outputPose = m_blendSourcePose;
outputPose.Blend(&m_blendTargetPose, m_blendWeight);
}
else
{
m_blendSourcePose.InitFromBindPose(m_actorInstance);
if (m_prevMotionInstance)
{
SamplePose(m_prevMotionInstance, m_blendSourcePose);
}
outputPose = m_blendSourcePose;
}
}
void MotionMatchingInstance::Update(float timePassedInSeconds, const AZ::Vector3& targetPos, const AZ::Vector3& targetFacingDir, TrajectoryQuery::EMode mode, float pathRadius, float pathSpeed)
{
AZ_PROFILE_SCOPE(Animation, "MotionMatchingInstance::Update");
if (!m_data)
{
return;
}
size_t currentFrameIndex = GetLowestCostFrameIndex();
if (currentFrameIndex == InvalidIndex)
{
currentFrameIndex = 0;
}
// Add the sample from the last frame (post-motion extraction)
m_trajectoryHistory.AddSample(*m_actorInstance->GetTransformData()->GetCurrentPose());
// Advance the trajectory history time. There is no sample for the new time yet, as the current update is about to produce it.
m_trajectoryHistory.Update(timePassedInSeconds);
// Register the current actor instance position to the history data of the spline.
m_trajectoryQuery.Update(m_actorInstance,
m_cachedTrajectoryFeature,
m_trajectoryHistory,
mode,
targetPos,
targetFacingDir,
timePassedInSeconds,
pathRadius,
pathSpeed);
// Calculate the new time value of the motion, but don't set it yet (the syncing might adjust this again)
m_motionInstance->SetFreezeAtLastFrame(true);
m_motionInstance->SetMaxLoops(1);
const float newMotionTime = m_motionInstance->CalcPlayStateAfterUpdate(timePassedInSeconds).m_currentTime;
m_newMotionTime = newMotionTime;
// Keep on playing the previous instance as we're blending the poses and motion extraction deltas.
m_prevMotionInstance->Update(timePassedInSeconds);
m_timeSinceLastFrameSwitch += timePassedInSeconds;
const float lowestCostSearchTimeInterval = 1.0f / m_lowestCostSearchFrequency;
if (m_blending)
{
const float maxBlendTime = lowestCostSearchTimeInterval;
m_blendProgressTime += timePassedInSeconds;
if (m_blendProgressTime > maxBlendTime)
{
m_blendWeight = 1.0f;
m_blendProgressTime = maxBlendTime;
m_blending = false;
}
else
{
m_blendWeight = AZ::GetClamp(m_blendProgressTime / maxBlendTime, 0.0f, 1.0f);
}
}
const bool searchLowestCostFrame = m_timeSinceLastFrameSwitch >= lowestCostSearchTimeInterval;
if (searchLowestCostFrame)
{
// Calculate the input query pose for the motion matching search algorithm.
{
// Sample the pose for the new motion time as the motion instance has not been updated with the timeDelta from this frame yet.
SamplePose(m_motionInstance->GetMotion(), m_queryPose, newMotionTime);
// Copy over the motion extraction joint transform from the current pose to the newly sampled pose.
// When sampling a motion, the motion extraction joint is in animation space, while we need the query pose to be in
// world space.
// Note: This does not yet take the extraction delta from the current tick into account.
if (m_actorInstance->GetActor()->GetMotionExtractionNode())
{
const Pose* currentPose = m_actorInstance->GetTransformData()->GetCurrentPose();
const size_t motionExtractionJointIndex = m_actorInstance->GetActor()->GetMotionExtractionNodeIndex();
m_queryPose.SetWorldSpaceTransform(motionExtractionJointIndex,
currentPose->GetWorldSpaceTransform(motionExtractionJointIndex));
}
// Calculate the joint velocities for the sampled pose using the same method as we do for the frame database.
PoseDataJointVelocities* velocityPoseData = m_queryPose.GetAndPreparePoseData<PoseDataJointVelocities>(m_actorInstance);
velocityPoseData->CalculateVelocity(m_motionInstance, m_cachedTrajectoryFeature->GetRelativeToNodeIndex());
}
const FeatureMatrix& featureMatrix = m_data->GetFeatureMatrix();
const FrameDatabase& frameDatabase = m_data->GetFrameDatabase();
Feature::FrameCostContext frameCostContext(featureMatrix, m_queryPose);
frameCostContext.m_trajectoryQuery = &m_trajectoryQuery;
frameCostContext.m_actorInstance = m_actorInstance;
const size_t lowestCostFrameIndex = FindLowestCostFrameIndex(frameCostContext);
const Frame& currentFrame = frameDatabase.GetFrame(currentFrameIndex);
const Frame& lowestCostFrame = frameDatabase.GetFrame(lowestCostFrameIndex);
const bool sameMotion = (currentFrame.GetSourceMotion() == lowestCostFrame.GetSourceMotion());
const float timeBetweenFrames = newMotionTime - lowestCostFrame.GetSampleTime();
const bool sameLocation = sameMotion && (AZ::GetAbs(timeBetweenFrames) < 0.1f);
if (lowestCostFrameIndex != currentFrameIndex && !sameLocation)
{
// Start a blend.
m_blending = true;
m_blendWeight = 0.0f;
m_blendProgressTime = 0.0f;
// Store the current motion instance state, so we can sample this as source pose.
m_prevMotionInstance->SetMotion(m_motionInstance->GetMotion());
m_prevMotionInstance->SetMirrorMotion(m_motionInstance->GetMirrorMotion());
m_prevMotionInstance->SetCurrentTime(newMotionTime, true);
m_prevMotionInstance->SetLastCurrentTime(m_prevMotionInstance->GetCurrentTime() - timePassedInSeconds);
m_lowestCostFrameIndex = lowestCostFrameIndex;
m_motionInstance->SetMotion(lowestCostFrame.GetSourceMotion());
m_motionInstance->SetMirrorMotion(lowestCostFrame.GetMirrored());
// The new motion time will become the current time after this frame while the current time
// becomes the last current time. As we just start playing at the search frame, calculate
// the last time based on the time delta.
m_motionInstance->SetCurrentTime(lowestCostFrame.GetSampleTime() - timePassedInSeconds, true);
m_newMotionTime = lowestCostFrame.GetSampleTime();
}
// Always reset the timer; otherwise we would search for the lowest cost frame index too often.
m_timeSinceLastFrameSwitch = 0.0f;
}
// ImGui monitor
{
#ifdef IMGUI_ENABLED
const KdTree& kdTree = m_data->GetKdTree();
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::SetKdTreeMemoryUsage, kdTree.CalcMemoryUsageInBytes());
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::SetKdTreeNumNodes, kdTree.GetNumNodes());
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::SetKdTreeNumDimensions, kdTree.GetNumDimensions());
// TODO: add memory usage for frame database
const FeatureMatrix& featureMatrix = m_data->GetFeatureMatrix();
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::SetFeatureMatrixMemoryUsage, featureMatrix.CalcMemoryUsageInBytes());
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::SetFeatureMatrixNumFrames, featureMatrix.rows());
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::SetFeatureMatrixNumComponents, featureMatrix.cols());
#endif
}
}
size_t MotionMatchingInstance::FindLowestCostFrameIndex(const Feature::FrameCostContext& context)
{
AZ::Debug::Timer timer;
timer.Stamp();
AZ_PROFILE_SCOPE(Animation, "MotionMatchingInstance::FindLowestCostFrameIndex");
const FrameDatabase& frameDatabase = m_data->GetFrameDatabase();
const FeatureSchema& featureSchema = m_data->GetFeatureSchema();
const FeatureTrajectory* trajectoryFeature = m_cachedTrajectoryFeature;
// 1. Broad-phase search using KD-tree
{
// Build the input query features that will be compared to every entry in the feature database in the motion matching search.
size_t startOffset = 0;
for (Feature* feature : m_data->GetFeaturesInKdTree())
{
feature->FillQueryFeatureValues(startOffset, m_queryFeatureValues, context);
startOffset += feature->GetNumDimensions();
}
AZ_Assert(startOffset == m_queryFeatureValues.size(), "Frame float vector is not the expected size.");
// Find our nearest frames.
m_data->GetKdTree().FindNearestNeighbors(m_queryFeatureValues, m_nearestFrames);
}
// 2. Narrow-phase, brute force find the actual best matching frame (frame with the minimal cost).
float minCost = FLT_MAX;
size_t minCostFrameIndex = 0;
m_tempCosts.resize(featureSchema.GetNumFeatures());
m_minCosts.resize(featureSchema.GetNumFeatures());
float minTrajectoryPastCost = 0.0f;
float minTrajectoryFutureCost = 0.0f;
// Iterate through the frames filtered by the broad-phase search.
for (const size_t frameIndex : m_nearestFrames)
{
const Frame& frame = frameDatabase.GetFrame(frameIndex);
// TODO: This check shouldn't be here. Frames should be discarded when extracting the features, not at runtime while evaluating the cost.
if (frame.GetSampleTime() >= frame.GetSourceMotion()->GetDuration() - 1.0f)
{
continue;
}
float frameCost = 0.0f;
// Calculate the frame cost by accumulating the weighted feature costs.
for (size_t featureIndex = 0; featureIndex < featureSchema.GetNumFeatures(); ++featureIndex)
{
Feature* feature = featureSchema.GetFeature(featureIndex);
if (feature->RTTI_GetType() != azrtti_typeid<FeatureTrajectory>())
{
const float featureCost = feature->CalculateFrameCost(frameIndex, context);
const float featureCostFactor = feature->GetCostFactor();
const float featureFinalCost = featureCost * featureCostFactor;
frameCost += featureFinalCost;
m_tempCosts[featureIndex] = featureFinalCost;
}
}
// Manually add the trajectory cost.
float trajectoryPastCost = 0.0f;
float trajectoryFutureCost = 0.0f;
if (trajectoryFeature)
{
trajectoryPastCost = trajectoryFeature->CalculatePastFrameCost(frameIndex, context) * trajectoryFeature->GetPastCostFactor();
trajectoryFutureCost = trajectoryFeature->CalculateFutureFrameCost(frameIndex, context) * trajectoryFeature->GetFutureCostFactor();
frameCost += trajectoryPastCost;
frameCost += trajectoryFutureCost;
}
// Track the minimum feature and frame costs.
if (frameCost < minCost)
{
minCost = frameCost;
minCostFrameIndex = frameIndex;
for (size_t featureIndex = 0; featureIndex < featureSchema.GetNumFeatures(); ++featureIndex)
{
Feature* feature = featureSchema.GetFeature(featureIndex);
if (feature->RTTI_GetType() != azrtti_typeid<FeatureTrajectory>())
{
m_minCosts[featureIndex] = m_tempCosts[featureIndex];
}
}
minTrajectoryPastCost = trajectoryPastCost;
minTrajectoryFutureCost = trajectoryFutureCost;
}
}
// 3. ImGui debug visualization
{
const float time = timer.GetDeltaTimeInSeconds();
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushPerformanceHistogramValue, "FindLowestCostFrameIndex", time * 1000.0f);
for (size_t featureIndex = 0; featureIndex < featureSchema.GetNumFeatures(); ++featureIndex)
{
Feature* feature = featureSchema.GetFeature(featureIndex);
if (feature->RTTI_GetType() != azrtti_typeid<FeatureTrajectory>())
{
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushCostHistogramValue,
feature->GetName().c_str(),
m_minCosts[featureIndex],
feature->GetDebugDrawColor());
}
}
if (trajectoryFeature)
{
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushCostHistogramValue, "Future Trajectory", minTrajectoryFutureCost, trajectoryFeature->GetDebugDrawColor());
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushCostHistogramValue, "Past Trajectory", minTrajectoryPastCost, trajectoryFeature->GetDebugDrawColor());
}
ImGuiMonitorRequestBus::Broadcast(&ImGuiMonitorRequests::PushCostHistogramValue, "Total Cost", minCost, AZ::Color::CreateFromRgba(202,255,191,255));
}
return minCostFrameIndex;
}
} // namespace EMotionFX::MotionMatching
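As a rough illustration of the two-phase search in `FindLowestCostFrameIndex()` above (a broad phase that narrows the candidate set, followed by a brute-force minimum-cost pass), here is a minimal self-contained sketch. `FeatureRow`, `FindNearestFrames`, and `FindLowestCostFrame` are hypothetical stand-ins, not the engine API, and the broad phase is shown as a linear scan where the real code uses a KD-tree:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <limits>
#include <utility>
#include <vector>

// Hypothetical stand-in for the feature matrix: one row of feature values per frame.
using FeatureRow = std::vector<float>;

// Broad phase: keep the k frames whose feature rows are closest to the query.
// (The real code delegates this to a KD-tree; a linear scan shows the idea.)
std::vector<size_t> FindNearestFrames(const std::vector<FeatureRow>& frames,
                                      const FeatureRow& query, size_t k)
{
    std::vector<std::pair<float, size_t>> distances;
    for (size_t i = 0; i < frames.size(); ++i)
    {
        float sqDist = 0.0f;
        for (size_t d = 0; d < query.size(); ++d)
        {
            const float diff = frames[i][d] - query[d];
            sqDist += diff * diff;
        }
        distances.emplace_back(sqDist, i);
    }
    const size_t count = std::min(k, distances.size());
    std::partial_sort(distances.begin(), distances.begin() + count, distances.end());
    std::vector<size_t> result;
    for (size_t i = 0; i < count; ++i)
    {
        result.push_back(distances[i].second);
    }
    return result;
}

// Narrow phase: brute-force the candidate with the minimal weighted feature cost.
size_t FindLowestCostFrame(const std::vector<FeatureRow>& frames,
                           const FeatureRow& query,
                           const std::vector<size_t>& candidates,
                           const std::vector<float>& costFactors)
{
    float minCost = std::numeric_limits<float>::max();
    size_t minCostFrame = 0;
    for (size_t frameIndex : candidates)
    {
        float cost = 0.0f;
        for (size_t d = 0; d < query.size(); ++d)
        {
            cost += std::fabs(frames[frameIndex][d] - query[d]) * costFactors[d];
        }
        if (cost < minCost)
        {
            minCost = cost;
            minCostFrame = frameIndex;
        }
    }
    return minCostFrame;
}
```

The broad phase trades exactness for speed; the narrow phase then applies the full per-feature cost factors only to the surviving candidates, which mirrors the structure of the function above.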

@@ -0,0 +1,116 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Vector3.h>
#include <AzCore/Memory/Memory.h>
#include <AzCore/RTTI/RTTI.h>
#include <AzFramework/Entity/EntityDebugDisplayBus.h>
#include <EMotionFX/Source/EMotionFXConfig.h>
#include <Feature.h>
#include <TrajectoryHistory.h>
#include <TrajectoryQuery.h>
namespace AZ
{
class ReflectContext;
}
namespace EMotionFX
{
class ActorInstance;
class Motion;
}
namespace EMotionFX::MotionMatching
{
class MotionMatchingData;
class EMFX_API MotionMatchingInstance
{
public:
AZ_RTTI(MotionMatchingInstance, "{1ED03AD8-0FB2-431B-AF01-02F7E930EB73}")
AZ_CLASS_ALLOCATOR_DECL
virtual ~MotionMatchingInstance();
struct EMFX_API InitSettings
{
ActorInstance* m_actorInstance = nullptr;
MotionMatchingData* m_data = nullptr;
};
void Init(const InitSettings& settings);
void DebugDraw();
void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay);
void Update(float timePassedInSeconds, const AZ::Vector3& targetPos, const AZ::Vector3& targetFacingDir, TrajectoryQuery::EMode mode, float pathRadius, float pathSpeed);
void PostUpdate(float timeDelta);
void Output(Pose& outputPose);
MotionInstance* GetMotionInstance() const { return m_motionInstance; }
ActorInstance* GetActorInstance() const { return m_actorInstance; }
MotionMatchingData* GetData() const { return m_data; }
size_t GetLowestCostFrameIndex() const { return m_lowestCostFrameIndex; }
void SetLowestCostSearchFrequency(float frequency) { m_lowestCostSearchFrequency = frequency; }
float GetNewMotionTime() const { return m_newMotionTime; }
/**
* Get the cached trajectory feature.
* The trajectory feature is searched in the feature schema used in the current instance at init time.
*/
FeatureTrajectory* GetTrajectoryFeature() const { return m_cachedTrajectoryFeature; }
const TrajectoryQuery& GetTrajectoryQuery() const { return m_trajectoryQuery; }
const TrajectoryHistory& GetTrajectoryHistory() const { return m_trajectoryHistory; }
const Transform& GetMotionExtractionDelta() const { return m_motionExtractionDelta; }
private:
MotionInstance* CreateMotionInstance() const;
void SamplePose(MotionInstance* motionInstance, Pose& outputPose);
void SamplePose(Motion* motion, Pose& outputPose, float sampleTime) const;
size_t FindLowestCostFrameIndex(const Feature::FrameCostContext& context);
MotionMatchingData* m_data = nullptr;
ActorInstance* m_actorInstance = nullptr;
Pose m_blendSourcePose;
Pose m_blendTargetPose;
Pose m_queryPose; //!< Input query pose for the motion matching search.
MotionInstance* m_motionInstance = nullptr;
MotionInstance* m_prevMotionInstance = nullptr;
Transform m_motionExtractionDelta = Transform::CreateIdentity();
/// Buffers used for the broad-phase KD-tree search.
AZStd::vector<float> m_queryFeatureValues; //!< The input query features compared against every entry/row in the feature matrix during the motion matching search.
AZStd::vector<size_t> m_nearestFrames; //!< Stores the nearest matching frames / search result from the KD-tree.
FeatureTrajectory* m_cachedTrajectoryFeature = nullptr; //!< Cached pointer to the trajectory feature in the feature schema.
TrajectoryQuery m_trajectoryQuery;
TrajectoryHistory m_trajectoryHistory;
static constexpr float m_trajectorySecsToTrack = 5.0f;
float m_timeSinceLastFrameSwitch = 0.0f;
float m_newMotionTime = 0.0f;
size_t m_lowestCostFrameIndex = InvalidIndex;
float m_lowestCostSearchFrequency = 5.0f; //!< How often per second the lowest cost frame shall be searched.
bool m_blending = false;
float m_blendWeight = 1.0f;
float m_blendProgressTime = 0.0f; //!< Time we have been blending so far, in seconds.
/// Buffers used for FindLowestCostFrameIndex().
AZStd::vector<float> m_tempCosts;
AZStd::vector<float> m_minCosts;
AZStd::vector<AzFramework::DebugDisplayRequests*> m_debugDisplays;
};
} // namespace EMotionFX::MotionMatching
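The instance searches for a new lowest-cost frame at `m_lowestCostSearchFrequency` Hz and, when it jumps to a new frame, blends toward it over exactly one search interval. A minimal sketch of that blend-weight bookkeeping (names like `BlendState` and `UpdateBlend` are illustrative, not the engine API):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Illustrative blend state: the blend finishes after exactly one
// lowest-cost search interval (1 / searchFrequency seconds).
struct BlendState
{
    bool m_blending = false;
    float m_blendWeight = 1.0f;
    float m_blendProgressTime = 0.0f;
};

void UpdateBlend(BlendState& state, float timeDelta, float searchFrequency)
{
    if (!state.m_blending)
    {
        return;
    }
    const float maxBlendTime = 1.0f / searchFrequency;
    state.m_blendProgressTime += timeDelta;
    if (state.m_blendProgressTime > maxBlendTime)
    {
        // Blend finished: clamp the progress and snap the weight to one.
        state.m_blendWeight = 1.0f;
        state.m_blendProgressTime = maxBlendTime;
        state.m_blending = false;
    }
    else
    {
        // Linear ramp from 0 to 1 across the search interval.
        state.m_blendWeight = std::clamp(state.m_blendProgressTime / maxBlendTime, 0.0f, 1.0f);
    }
}
```

Tying the blend duration to the search interval guarantees that one blend completes before the next frame switch can start a new one.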

@@ -0,0 +1,23 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <MotionMatchingModuleInterface.h>
#include <MotionMatchingSystemComponent.h>
namespace EMotionFX::MotionMatching
{
class MotionMatchingModule
: public MotionMatchingModuleInterface
{
public:
AZ_RTTI(MotionMatchingModule, "{cf4381d1-0207-4ef8-85f0-6c88ec28a7b6}", MotionMatchingModuleInterface);
AZ_CLASS_ALLOCATOR(MotionMatchingModule, AZ::SystemAllocator, 0);
};
} // namespace EMotionFX::MotionMatching
AZ_DECLARE_MODULE_CLASS(Gem_MotionMatching, EMotionFX::MotionMatching::MotionMatchingModule)

@@ -0,0 +1,39 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Memory/SystemAllocator.h>
#include <AzCore/Module/Module.h>
#include <MotionMatchingSystemComponent.h>
namespace EMotionFX::MotionMatching
{
class MotionMatchingModuleInterface
: public AZ::Module
{
public:
AZ_RTTI(MotionMatchingModuleInterface, "{33e8e826-b143-4008-89f3-9a46ad3de4fe}", AZ::Module);
AZ_CLASS_ALLOCATOR(MotionMatchingModuleInterface, AZ::SystemAllocator, 0);
MotionMatchingModuleInterface()
{
m_descriptors.insert(m_descriptors.end(),
{
MotionMatchingSystemComponent::CreateDescriptor(),
});
}
/// Add required SystemComponents to the SystemEntity.
AZ::ComponentTypeList GetRequiredSystemComponents() const override
{
return AZ::ComponentTypeList
{
azrtti_typeid<MotionMatchingSystemComponent>(),
};
}
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,128 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzCore/Serialization/SerializeContext.h>
#include <AzCore/Serialization/EditContext.h>
#include <AzCore/Serialization/EditContextConstants.inl>
#include <EMotionFX/Source/AnimGraphObjectFactory.h>
#include <EMotionFX/Source/EMotionFXManager.h>
#include <EMotionFX/Source/PoseDataFactory.h>
#include <Integration/EMotionFXBus.h>
#include <BlendTreeMotionMatchNode.h>
#include <Feature.h>
#include <FeaturePosition.h>
#include <FeatureTrajectory.h>
#include <FeatureVelocity.h>
#include <EventData.h>
#include <MotionMatchingSystemComponent.h>
#include <PoseDataJointVelocities.h>
namespace EMotionFX::MotionMatching
{
void MotionMatchingSystemComponent::Reflect(AZ::ReflectContext* context)
{
if (AZ::SerializeContext* serialize = azrtti_cast<AZ::SerializeContext*>(context))
{
serialize->Class<MotionMatchingSystemComponent, AZ::Component>()
->Version(0)
;
if (AZ::EditContext* ec = serialize->GetEditContext())
{
ec->Class<MotionMatchingSystemComponent>("MotionMatching", "[Description of functionality provided by this System Component]")
->ClassElement(AZ::Edit::ClassElements::EditorData, "")
->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC("System"))
->Attribute(AZ::Edit::Attributes::AutoExpand, true)
;
}
}
EMotionFX::MotionMatching::DiscardFrameEventData::Reflect(context);
EMotionFX::MotionMatching::TagEventData::Reflect(context);
EMotionFX::MotionMatching::FeatureSchema::Reflect(context);
EMotionFX::MotionMatching::Feature::Reflect(context);
EMotionFX::MotionMatching::FeaturePosition::Reflect(context);
EMotionFX::MotionMatching::FeatureTrajectory::Reflect(context);
EMotionFX::MotionMatching::FeatureVelocity::Reflect(context);
EMotionFX::MotionMatching::PoseDataJointVelocities::Reflect(context);
EMotionFX::MotionMatching::BlendTreeMotionMatchNode::Reflect(context);
}
void MotionMatchingSystemComponent::GetProvidedServices(AZ::ComponentDescriptor::DependencyArrayType& provided)
{
provided.push_back(AZ_CRC_CE("MotionMatchingService"));
}
void MotionMatchingSystemComponent::GetIncompatibleServices(AZ::ComponentDescriptor::DependencyArrayType& incompatible)
{
incompatible.push_back(AZ_CRC_CE("MotionMatchingService"));
}
void MotionMatchingSystemComponent::GetRequiredServices([[maybe_unused]] AZ::ComponentDescriptor::DependencyArrayType& required)
{
required.push_back(AZ_CRC("EMotionFXAnimationService", 0x3f8a6369));
}
void MotionMatchingSystemComponent::GetDependentServices([[maybe_unused]] AZ::ComponentDescriptor::DependencyArrayType& dependent)
{
}
MotionMatchingSystemComponent::MotionMatchingSystemComponent()
{
if (MotionMatchingInterface::Get() == nullptr)
{
MotionMatchingInterface::Register(this);
}
}
MotionMatchingSystemComponent::~MotionMatchingSystemComponent()
{
if (MotionMatchingInterface::Get() == this)
{
MotionMatchingInterface::Unregister(this);
}
}
void MotionMatchingSystemComponent::Init()
{
}
void MotionMatchingSystemComponent::Activate()
{
MotionMatchingRequestBus::Handler::BusConnect();
AZ::TickBus::Handler::BusConnect();
// Register the motion matching anim graph node
EMotionFX::AnimGraphObject* motionMatchNodeObject = EMotionFX::AnimGraphObjectFactory::Create(azrtti_typeid<EMotionFX::MotionMatching::BlendTreeMotionMatchNode>());
auto motionMatchNode = azdynamic_cast<EMotionFX::MotionMatching::BlendTreeMotionMatchNode*>(motionMatchNodeObject);
if (motionMatchNode)
{
EMotionFX::Integration::EMotionFXRequestBus::Broadcast(&EMotionFX::Integration::EMotionFXRequests::RegisterAnimGraphObjectType, motionMatchNode);
delete motionMatchNode;
}
// Register the joint velocities pose data.
EMotionFX::GetPoseDataFactory().AddPoseDataType(azrtti_typeid<EMotionFX::MotionMatching::PoseDataJointVelocities>());
}
void MotionMatchingSystemComponent::Deactivate()
{
AZ::TickBus::Handler::BusDisconnect();
MotionMatchingRequestBus::Handler::BusDisconnect();
}
void MotionMatchingSystemComponent::OnTick([[maybe_unused]] float deltaTime, [[maybe_unused]] AZ::ScriptTimePoint time)
{
}
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,51 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Component/Component.h>
#include <AzCore/Component/TickBus.h>
#include <MotionMatching/MotionMatchingBus.h>
namespace EMotionFX::MotionMatching
{
class MotionMatchingSystemComponent
: public AZ::Component
, protected MotionMatchingRequestBus::Handler
, public AZ::TickBus::Handler
{
public:
AZ_COMPONENT(MotionMatchingSystemComponent, "{158cd35c-b548-4d7b-9493-9a3c5c359e49}");
static void Reflect(AZ::ReflectContext* context);
static void GetProvidedServices(AZ::ComponentDescriptor::DependencyArrayType& provided);
static void GetIncompatibleServices(AZ::ComponentDescriptor::DependencyArrayType& incompatible);
static void GetRequiredServices(AZ::ComponentDescriptor::DependencyArrayType& required);
static void GetDependentServices(AZ::ComponentDescriptor::DependencyArrayType& dependent);
MotionMatchingSystemComponent();
~MotionMatchingSystemComponent();
protected:
////////////////////////////////////////////////////////////////////////
// MotionMatchingRequestBus interface implementation
////////////////////////////////////////////////////////////////////////
// AZ::Component interface implementation
void Init() override;
void Activate() override;
void Deactivate() override;
////////////////////////////////////////////////////////////////////////
////////////////////////////////////////////////////////////////////////
// AZTickBus interface implementation
void OnTick(float deltaTime, AZ::ScriptTimePoint time) override;
////////////////////////////////////////////////////////////////////////
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,160 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <EMotionFX/Source/ActorInstance.h>
#include <EMotionFX/Source/MotionInstance.h>
#include <EMotionFX/Source/Velocity.h>
#include <Allocators.h>
#include <Feature.h>
#include <PoseDataJointVelocities.h>
namespace EMotionFX::MotionMatching
{
AZ_CLASS_ALLOCATOR_IMPL(PoseDataJointVelocities, MotionMatchAllocator, 0)
PoseDataJointVelocities::PoseDataJointVelocities()
: PoseData()
{
}
PoseDataJointVelocities::~PoseDataJointVelocities()
{
Clear();
}
void PoseDataJointVelocities::Clear()
{
m_velocities.clear();
m_angularVelocities.clear();
}
void PoseDataJointVelocities::LinkToActorInstance(const ActorInstance* actorInstance)
{
m_velocities.resize(actorInstance->GetNumNodes());
m_angularVelocities.resize(actorInstance->GetNumNodes());
SetRelativeToJointIndex(actorInstance->GetActor()->GetMotionExtractionNodeIndex());
}
void PoseDataJointVelocities::SetRelativeToJointIndex(size_t relativeToJointIndex)
{
if (relativeToJointIndex == InvalidIndex)
{
m_relativeToJointIndex = 0;
}
else
{
m_relativeToJointIndex = relativeToJointIndex;
}
}
void PoseDataJointVelocities::LinkToActor(const Actor* actor)
{
AZ_UNUSED(actor);
Clear();
}
void PoseDataJointVelocities::Reset()
{
const size_t numJoints = m_velocities.size();
for (size_t i = 0; i < numJoints; ++i)
{
m_velocities[i] = AZ::Vector3::CreateZero();
m_angularVelocities[i] = AZ::Vector3::CreateZero();
}
}
void PoseDataJointVelocities::CopyFrom(const PoseData* from)
{
AZ_Assert(from->RTTI_GetType() == azrtti_typeid<PoseDataJointVelocities>(), "Cannot copy from pose data other than joint velocity pose data.");
const PoseDataJointVelocities* fromVelocityPoseData = static_cast<const PoseDataJointVelocities*>(from);
m_isUsed = fromVelocityPoseData->m_isUsed;
m_velocities = fromVelocityPoseData->m_velocities;
m_angularVelocities = fromVelocityPoseData->m_angularVelocities;
m_relativeToJointIndex = fromVelocityPoseData->m_relativeToJointIndex;
}
void PoseDataJointVelocities::Blend(const Pose* destPose, float weight)
{
PoseDataJointVelocities* destPoseData = destPose->GetPoseData<PoseDataJointVelocities>();
if (destPoseData && destPoseData->IsUsed())
{
AZ_Assert(m_velocities.size() == destPoseData->m_velocities.size(), "Expected the same number of joints and velocities in the destination pose data.");
if (m_isUsed)
{
// Blend while both the current pose and the destination pose hold joint velocities.
for (size_t i = 0; i < m_velocities.size(); ++i)
{
m_velocities[i] = m_velocities[i].Lerp(destPoseData->m_velocities[i], weight);
m_angularVelocities[i] = m_angularVelocities[i].Lerp(destPoseData->m_angularVelocities[i], weight);
}
}
else
{
// The destination pose data is used while the current one is not. Just copy over the velocities from the destination.
m_velocities = destPoseData->m_velocities;
m_angularVelocities = destPoseData->m_angularVelocities;
}
}
else
{
// Destination pose either doesn't contain velocity pose data or it is unused.
// Don't do anything and keep the current velocities.
}
}
void PoseDataJointVelocities::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay, const AZ::Color& color) const
{
AZ_Assert(m_pose->GetNumTransforms() == m_velocities.size(), "Expected a joint velocity for each joint in the pose.");
const Pose* pose = m_pose;
for (size_t i = 0; i < m_velocities.size(); ++i)
{
const size_t jointIndex = i;
// draw linear velocity
{
const Transform jointModelTM = pose->GetModelSpaceTransform(jointIndex);
const Transform relativeToWorldTM = pose->GetWorldSpaceTransform(m_relativeToJointIndex);
const AZ::Vector3 jointPosition = relativeToWorldTM.TransformPoint(jointModelTM.m_position);
const AZ::Vector3& velocity = m_velocities[i];
const float scale = 0.15f;
const AZ::Vector3 velocityWorldSpace = relativeToWorldTM.TransformVector(velocity * scale);
DebugDrawVelocity(debugDisplay, jointPosition, velocityWorldSpace, color);
}
}
}
void PoseDataJointVelocities::CalculateVelocity(MotionInstance* motionInstance, size_t relativeToJointIndex)
{
SetRelativeToJointIndex(relativeToJointIndex);
ActorInstance* actorInstance = motionInstance->GetActorInstance();
m_velocities.resize(actorInstance->GetNumNodes());
m_angularVelocities.resize(actorInstance->GetNumNodes());
for (size_t i = 0; i < m_velocities.size(); ++i)
{
Feature::CalculateVelocity(i, m_relativeToJointIndex, motionInstance, m_velocities[i]);
// TODO: Angular velocity not used yet.
}
}
void PoseDataJointVelocities::Reflect(AZ::ReflectContext* context)
{
AZ::SerializeContext* serializeContext = azrtti_cast<AZ::SerializeContext*>(context);
if (serializeContext)
{
serializeContext->Class<PoseDataJointVelocities, PoseData>()->Version(1);
}
}
} // namespace EMotionFX::MotionMatching
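The per-joint blend in `PoseDataJointVelocities::Blend()` above is a plain component-wise lerp of each velocity toward the destination pose data. A self-contained sketch of that operation (`Vec3`, `Lerp`, and `BlendVelocities` are minimal stand-ins for `AZ::Vector3` and the engine code, not the actual API):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Minimal stand-in for AZ::Vector3.
struct Vec3
{
    float x = 0.0f, y = 0.0f, z = 0.0f;
};

// Component-wise linear interpolation from a to b by t.
Vec3 Lerp(const Vec3& a, const Vec3& b, float t)
{
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
}

// Lerp every joint velocity toward the destination velocities by the blend weight.
void BlendVelocities(std::vector<Vec3>& velocities, const std::vector<Vec3>& dest, float weight)
{
    assert(velocities.size() == dest.size());
    for (size_t i = 0; i < velocities.size(); ++i)
    {
        velocities[i] = Lerp(velocities[i], dest[i], weight);
    }
}
```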

@@ -0,0 +1,60 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Vector3.h>
#include <AzCore/Math/Quaternion.h>
#include <EMotionFX/Source/PoseData.h>
#include <AzFramework/Entity/EntityDebugDisplayBus.h>
namespace EMotionFX::MotionMatching
{
/**
* Extends a given pose with joint-relative linear and angular velocities.
**/
class EMFX_API PoseDataJointVelocities
: public PoseData
{
public:
AZ_RTTI(PoseDataJointVelocities, "{9C082B82-7225-4550-A52C-C920CCC2482C}", PoseData)
AZ_CLASS_ALLOCATOR_DECL
PoseDataJointVelocities();
~PoseDataJointVelocities();
void Clear();
void LinkToActorInstance(const ActorInstance* actorInstance) override;
void LinkToActor(const Actor* actor) override;
void Reset() override;
void CopyFrom(const PoseData* from) override;
void Blend(const Pose* destPose, float weight) override;
void CalculateVelocity(MotionInstance* motionInstance, size_t relativeToJointIndex);
void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay, const AZ::Color& color) const override;
AZStd::vector<AZ::Vector3>& GetVelocities() { return m_velocities; }
const AZStd::vector<AZ::Vector3>& GetVelocities() const { return m_velocities; }
const AZ::Vector3& GetVelocity(size_t jointIndex) const { return m_velocities[jointIndex]; }
AZStd::vector<AZ::Vector3>& GetAngularVelocities() { return m_angularVelocities; }
const AZStd::vector<AZ::Vector3>& GetAngularVelocities() const { return m_angularVelocities; }
const AZ::Vector3& GetAngularVelocity(size_t jointIndex) const { return m_angularVelocities[jointIndex]; }
static void Reflect(AZ::ReflectContext* context);
void SetRelativeToJointIndex(size_t relativeToJointIndex);
private:
AZStd::vector<AZ::Vector3> m_velocities;
AZStd::vector<AZ::Vector3> m_angularVelocities;
size_t m_relativeToJointIndex = InvalidIndex;
};
} // namespace EMotionFX::MotionMatching

@@ -0,0 +1,167 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <TrajectoryHistory.h>
#include <EMotionFX/Source/ActorInstance.h>
#include <EMotionFX/Source/TransformData.h>
#include <EMotionFX/Source/EMotionFXManager.h>
namespace EMotionFX::MotionMatching
{
TrajectoryHistory::Sample operator*(TrajectoryHistory::Sample sample, float weight)
{
return {sample.m_position * weight, sample.m_facingDirection * weight};
}
TrajectoryHistory::Sample operator*(float weight, TrajectoryHistory::Sample sample)
{
return {weight * sample.m_position, weight * sample.m_facingDirection};
}
TrajectoryHistory::Sample operator+(TrajectoryHistory::Sample lhs, const TrajectoryHistory::Sample& rhs)
{
return {lhs.m_position + rhs.m_position, lhs.m_facingDirection + rhs.m_facingDirection};
}
void TrajectoryHistory::Init(const Pose& pose, size_t jointIndex, const AZ::Vector3& facingAxisDir, float numSecondsToTrack)
{
AZ_Assert(numSecondsToTrack > 0.0f, "Number of seconds to track has to be greater than zero.");
Clear();
m_jointIndex = jointIndex;
m_facingAxisDir = facingAxisDir;
m_numSecondsToTrack = numSecondsToTrack;
// Pre-fill the history with samples from the current joint position.
PrefillSamples(pose, /*timeDelta=*/1.0f / 60.0f);
}
void TrajectoryHistory::AddSample(const Pose& pose)
{
Sample sample;
const Transform worldSpaceTransform = pose.GetWorldSpaceTransform(m_jointIndex);
sample.m_position = worldSpaceTransform.m_position;
sample.m_facingDirection = worldSpaceTransform.TransformVector(m_facingAxisDir).GetNormalizedSafe();
// The new key will be added at the end of the keytrack.
m_keytrack.AddKey(m_currentTime, sample);
while (m_keytrack.GetNumKeys() > 2 &&
((m_keytrack.GetKey(m_keytrack.GetNumKeys() - 2)->GetTime() - m_keytrack.GetFirstTime()) > m_numSecondsToTrack))
{
m_keytrack.RemoveKey(0); // Remove first (oldest) key
}
}
void TrajectoryHistory::PrefillSamples(const Pose& pose, float timeDelta)
{
const size_t numKeyframes = aznumeric_caster<>(m_numSecondsToTrack / timeDelta);
for (size_t i = 0; i < numKeyframes; ++i)
{
AddSample(pose);
Update(timeDelta);
}
}
void TrajectoryHistory::Clear()
{
m_jointIndex = 0;
m_currentTime = 0.0f;
m_keytrack.ClearKeys();
}
void TrajectoryHistory::Update(float timeDelta)
{
m_currentTime += timeDelta;
}
TrajectoryHistory::Sample TrajectoryHistory::Evaluate(float time) const
{
if (m_keytrack.GetNumKeys() == 0)
{
return {};
}
return m_keytrack.GetValueAtTime(m_keytrack.GetLastTime() - time);
}
TrajectoryHistory::Sample TrajectoryHistory::EvaluateNormalized(float normalizedTime) const
{
const float firstTime = m_keytrack.GetFirstTime();
const float lastTime = m_keytrack.GetLastTime();
const float range = lastTime - firstTime;
const float time = (1.0f - normalizedTime) * range + firstTime;
return m_keytrack.GetValueAtTime(time);
}
void TrajectoryHistory::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay, const AZ::Color& color, float timeStart) const
{
const size_t numKeyframes = m_keytrack.GetNumKeys();
if (numKeyframes == 0)
{
return;
}
// Clip some of the newest samples.
const float adjustedLastTime = m_keytrack.GetLastTime() - timeStart;
size_t adjustedLastKey = m_keytrack.FindKeyNumber(adjustedLastTime);
if (adjustedLastKey == InvalidIndex)
{
adjustedLastKey = m_keytrack.GetNumKeys() - 1;
}
const float firstTime = m_keytrack.GetFirstTime();
const float range = adjustedLastTime - firstTime;
debugDisplay.DepthTestOff();
for (size_t i = 0; i < adjustedLastKey; ++i)
{
const float time = m_keytrack.GetKey(i)->GetTime();
const float normalized = (time - firstTime) / range;
if (normalized < 0.3f)
{
continue;
}
// Decrease size and fade out alpha the older the sample is.
AZ::Color finalColor = color;
finalColor.SetA(finalColor.GetA() * 0.6f * normalized);
const float markerSize = m_debugMarkerSize * 0.7f * normalized;
const Sample currentSample = m_keytrack.GetKey(i)->GetValue();
debugDisplay.SetColor(finalColor);
debugDisplay.DrawBall(currentSample.m_position, markerSize, /*drawShaded=*/false);
const float facingDirectionLength = m_debugMarkerSize * 10.0f * normalized;
debugDisplay.DrawLine(currentSample.m_position, currentSample.m_position + currentSample.m_facingDirection * facingDirectionLength);
}
}
void TrajectoryHistory::DebugDrawSampled(AzFramework::DebugDisplayRequests& debugDisplay,
size_t numSamples,
const AZ::Color& color) const
{
debugDisplay.DepthTestOff();
debugDisplay.SetColor(color);
Sample lastSample = EvaluateNormalized(0.0f);
for (size_t i = 0; i < numSamples; ++i)
{
const float sampleTime = i / static_cast<float>(numSamples - 1);
const Sample currentSample = EvaluateNormalized(sampleTime);
if (i > 0)
{
debugDisplay.DrawLine(lastSample.m_position, currentSample.m_position);
}
debugDisplay.DrawBall(currentSample.m_position, m_debugMarkerSize, /*drawShaded=*/false);
lastSample = currentSample;
}
}
} // namespace EMotionFX::MotionMatching
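`EvaluateNormalized()` above maps a normalized time in [0, 1] onto the keytrack's time range so that 0 lands on the newest key (the current character position) and 1 on the oldest. The mapping is small enough to isolate; `NormalizedToKeytrackTime` is a hypothetical helper name used for illustration:

```cpp
#include <cassert>
#include <cmath>

// Map normalizedTime in [0, 1] to a keytrack time:
// 0 -> lastTime (newest key), 1 -> firstTime (oldest key).
float NormalizedToKeytrackTime(float normalizedTime, float firstTime, float lastTime)
{
    const float range = lastTime - firstTime;
    return (1.0f - normalizedTime) * range + firstTime;
}
```

This matches the `(1.0f - normalizedTime) * range + firstTime` expression in `EvaluateNormalized()`, which inverts the direction so debug drawing can walk from the newest sample backwards through the history.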

@@ -0,0 +1,63 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Vector3.h>
#include <AzCore/Math/Color.h>
#include <AzFramework/Entity/EntityDebugDisplayBus.h>
#include <EMotionFX/Source/Pose.h>
#include <EMotionFX/Source/KeyTrackLinearDynamic.h>
namespace EMotionFX::MotionMatching
{
//! Stores the trajectory history for the root motion (motion extraction node).
//! The trajectory history is independent of the trajectory feature and captures a sample every engine tick.
//! The history needs to cover at least the time range that the trajectory feature/query requires.
class EMFX_API TrajectoryHistory
{
public:
void Init(const Pose& pose, size_t jointIndex, const AZ::Vector3& facingAxisDir, float numSecondsToTrack);
void Clear();
void Update(float timeDelta);
void AddSample(const Pose& pose);
struct EMFX_API Sample
{
AZ::Vector3 m_position = AZ::Vector3::CreateZero();
AZ::Vector3 m_facingDirection = AZ::Vector3::CreateZero();
};
//! time in range [0, m_numSecondsToTrack]
Sample Evaluate(float time) const;
//! time in range [0, 1] where 0 is the current character position and 1 the oldest keyframe in the trajectory history
Sample EvaluateNormalized(float normalizedTime) const;
float GetNumSecondsToTrack() const { return m_numSecondsToTrack; }
float GetCurrentTime() const { return m_currentTime; }
size_t GetJointIndex() const { return m_jointIndex; }
void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay, const AZ::Color& color, float timeStart = 0.0f) const;
void DebugDrawSampled(AzFramework::DebugDisplayRequests& debugDisplay, size_t numSamples, const AZ::Color& color) const;
private:
void PrefillSamples(const Pose& pose, float timeDelta);
KeyTrackLinearDynamic<Sample> m_keytrack;
float m_numSecondsToTrack = 0.0f;
size_t m_jointIndex = 0;
float m_currentTime = 0.0f;
AZ::Vector3 m_facingAxisDir; //!< Facing direction of the character asset (e.g. (0, 1, 0) when it is looking along the Y-axis).
static constexpr float m_debugMarkerSize = 0.02f;
};
} // namespace EMotionFX::MotionMatching
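The header exposes two lookups: `Evaluate()` takes an absolute time in `[0, m_numSecondsToTrack]`, while `EvaluateNormalized()` takes a value in `[0, 1]` with 0 at the current character position. Assuming the normalized variant simply scales into the tracked window (a plausible reading of the two doc comments, not confirmed by this diff), the mapping can be sketched as:

```cpp
#include <cassert>
#include <cmath>

// Sketch of the presumed mapping between the two TrajectoryHistory lookups:
// EvaluateNormalized(t) ~= Evaluate(NormalizedToHistoryTime(t, numSecondsToTrack)),
// where t == 0 is "now" and t == 1 is the oldest tracked sample.
// The function name here is illustrative, not part of the engine API.
float NormalizedToHistoryTime(float normalizedTime, float numSecondsToTrack)
{
    // Clamp so callers cannot step outside the recorded window.
    if (normalizedTime < 0.0f)
    {
        normalizedTime = 0.0f;
    }
    if (normalizedTime > 1.0f)
    {
        normalizedTime = 1.0f;
    }
    return normalizedTime * numSecondsToTrack;
}
```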

@ -0,0 +1,163 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <TrajectoryQuery.h>
#include <EMotionFX/Source/ActorInstance.h>
#include <FeatureTrajectory.h>
namespace EMotionFX::MotionMatching
{
AZ::Vector3 SampleFunction(TrajectoryQuery::EMode mode, float offset, float radius, float phase)
{
switch (mode)
{
case TrajectoryQuery::MODE_TWO:
{
AZ::Vector3 displacement = AZ::Vector3::CreateZero();
displacement.SetX(radius * sinf(phase + offset));
displacement.SetY(cosf(phase + offset));
return displacement;
}
case TrajectoryQuery::MODE_THREE:
{
AZ::Vector3 displacement = AZ::Vector3::CreateZero();
const float rad = radius * cosf(radius + phase * 0.2f);
displacement.SetX(rad * sinf(phase + offset));
displacement.SetY(rad * cosf(phase + offset));
return displacement;
}
case TrajectoryQuery::MODE_FOUR:
{
AZ::Vector3 displacement = AZ::Vector3::CreateZero();
displacement.SetX(radius * sinf(phase + offset));
displacement.SetY(radius*2.0f * cosf(phase + offset));
return displacement;
}
// MODE_ONE and default
default:
{
AZ::Vector3 displacement = AZ::Vector3::CreateZero();
displacement.SetX(radius * sinf(phase * 0.7f + offset) + radius * 0.75f * cosf(phase * 2.0f + offset * 2.0f));
displacement.SetY(radius * cosf(phase * 0.4f + offset));
return displacement;
}
}
}
void TrajectoryQuery::Update(const ActorInstance* actorInstance,
const FeatureTrajectory* trajectoryFeature,
const TrajectoryHistory& trajectoryHistory,
EMode mode,
[[maybe_unused]] AZ::Vector3 targetPos,
[[maybe_unused]] AZ::Vector3 targetFacingDir,
float timeDelta,
float pathRadius,
float pathSpeed)
{
// Build the future trajectory control points.
const size_t numFutureSamples = trajectoryFeature->GetNumFutureSamples();
m_futureControlPoints.resize(numFutureSamples);
if (mode == MODE_TARGETDRIVEN)
{
const AZ::Vector3 curPos = actorInstance->GetWorldSpaceTransform().m_position;
if (curPos.IsClose(targetPos, 0.1f))
{
for (size_t i = 0; i < numFutureSamples; ++i)
{
m_futureControlPoints[i].m_position = curPos;
}
}
else
{
// NOTE: Could be improved by using a curve toward the target instead of a straight line.
for (size_t i = 0; i < numFutureSamples; ++i)
{
const float sampleTime = static_cast<float>(i) / (numFutureSamples - 1);
m_futureControlPoints[i].m_position = curPos.Lerp(targetPos, sampleTime);
}
}
}
else
{
static float phase = 0.0f;
phase += timeDelta * pathSpeed;
AZ::Vector3 base = SampleFunction(mode, 0.0f, pathRadius, phase);
for (size_t i = 0; i < numFutureSamples; ++i)
{
const float offset = i * 0.1f;
const AZ::Vector3 curSample = SampleFunction(mode, offset, pathRadius, phase);
AZ::Vector3 displacement = curSample - base;
m_futureControlPoints[i].m_position = actorInstance->GetWorldSpaceTransform().m_position + displacement;
// Evaluate a control point slightly further into the future than the actual
// one and use the position difference as the facing direction.
const AZ::Vector3 deltaSample = SampleFunction(mode, offset + 0.01f, pathRadius, phase);
const AZ::Vector3 dir = deltaSample - curSample;
m_futureControlPoints[i].m_facingDirection = dir.GetNormalizedSafe();
}
}
// Build the past trajectory control points.
const size_t numPastSamples = trajectoryFeature->GetNumPastSamples();
m_pastControlPoints.resize(numPastSamples);
const float pastTimeRange = trajectoryFeature->GetPastTimeRange();
for (size_t i = 0; i < numPastSamples; ++i)
{
const float sampleTimeNormalized = i / static_cast<float>(numPastSamples - 1);
const TrajectoryHistory::Sample sample = trajectoryHistory.Evaluate(sampleTimeNormalized * pastTimeRange);
m_pastControlPoints[i] = { sample.m_position, sample.m_facingDirection };
}
}
void TrajectoryQuery::DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay, const AZ::Color& color) const
{
DebugDrawControlPoints(debugDisplay, m_pastControlPoints, color);
DebugDrawControlPoints(debugDisplay, m_futureControlPoints, color);
}
void TrajectoryQuery::DebugDrawControlPoints(AzFramework::DebugDisplayRequests& debugDisplay,
const AZStd::vector<ControlPoint>& controlPoints,
const AZ::Color& color)
{
const float markerSize = 0.02f;
const size_t numControlPoints = controlPoints.size();
if (numControlPoints > 1)
{
debugDisplay.DepthTestOff();
debugDisplay.SetColor(color);
for (size_t i = 0; i < numControlPoints - 1; ++i)
{
const ControlPoint& current = controlPoints[i];
const AZ::Vector3& posA = current.m_position;
const AZ::Vector3& posB = controlPoints[i + 1].m_position;
const AZ::Vector3 diff = posB - posA;
debugDisplay.DrawSolidCylinder(/*center=*/(posB + posA) * 0.5f,
/*direction=*/diff.GetNormalizedSafe(),
/*radius=*/0.0025f,
/*height=*/diff.GetLength(),
/*drawShaded=*/false);
FeatureTrajectory::DebugDrawFacingDirection(debugDisplay, current.m_position, current.m_facingDirection);
}
for (const ControlPoint& controlPoint : controlPoints)
{
debugDisplay.DrawBall(controlPoint.m_position, markerSize, /*drawShaded=*/false);
FeatureTrajectory::DebugDrawFacingDirection(debugDisplay, controlPoint.m_position, controlPoint.m_facingDirection);
}
}
}
} // namespace EMotionFX::MotionMatching
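The facing direction of each future control point above is derived by finite differencing: the parametric path is sampled at the control point's offset and at a slightly larger offset, and the normalized difference is used as the direction. The same trick can be shown on a plain circle (an illustrative stand-in for `SampleFunction`; `Vec2` and the function names are not engine types):

```cpp
#include <cassert>
#include <cmath>

struct Vec2
{
    float x;
    float y;
};

// Point on a circle of the given radius, angle in radians.
Vec2 CirclePoint(float radius, float angle)
{
    return { radius * std::sin(angle), radius * std::cos(angle) };
}

// Sketch of the finite-difference facing direction used in
// TrajectoryQuery::Update: sample slightly further along the path and
// normalize the difference. 'epsilon' plays the role of the 0.01f offset above.
Vec2 FacingFromFiniteDifference(float radius, float angle, float epsilon)
{
    const Vec2 a = CirclePoint(radius, angle);
    const Vec2 b = CirclePoint(radius, angle + epsilon);
    Vec2 dir = { b.x - a.x, b.y - a.y };
    const float len = std::sqrt(dir.x * dir.x + dir.y * dir.y);
    if (len > 0.0f) // Mirrors GetNormalizedSafe(): avoid dividing by zero.
    {
        dir.x /= len;
        dir.y /= len;
    }
    return dir;
}
```

For a small epsilon the result approximates the tangent of the path, which is why the engine code evaluates a point "slightly further into the future" rather than computing an analytic derivative.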

@ -0,0 +1,68 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#pragma once
#include <AzCore/Math/Vector3.h>
#include <AzCore/Math/Color.h>
#include <AzFramework/Entity/EntityDebugDisplayBus.h>
#include <EMotionFX/Source/Pose.h>
#include <TrajectoryHistory.h>
namespace EMotionFX::MotionMatching
{
class FeatureTrajectory;
//! Builds the input trajectory query data for the motion matching algorithm.
//! Reads the number of past and future samples and the time ranges from the trajectory feature,
//! constructs the future trajectory based on the target and the past trajectory based on the trajectory history.
class EMFX_API TrajectoryQuery
{
public:
struct ControlPoint
{
AZ::Vector3 m_position;
AZ::Vector3 m_facingDirection;
};
enum EMode : AZ::u8
{
MODE_TARGETDRIVEN = 0,
MODE_ONE = 1,
MODE_TWO = 2,
MODE_THREE = 3,
MODE_FOUR = 4
};
void Update(const ActorInstance* actorInstance,
const FeatureTrajectory* trajectoryFeature,
const TrajectoryHistory& trajectoryHistory,
EMode mode,
AZ::Vector3 targetPos,
AZ::Vector3 targetFacingDir,
float timeDelta,
float pathRadius,
float pathSpeed);
void DebugDraw(AzFramework::DebugDisplayRequests& debugDisplay, const AZ::Color& color) const;
const AZStd::vector<ControlPoint>& GetPastControlPoints() const { return m_pastControlPoints; }
const AZStd::vector<ControlPoint>& GetFutureControlPoints() const { return m_futureControlPoints; }
private:
static void DebugDrawControlPoints(AzFramework::DebugDisplayRequests& debugDisplay,
const AZStd::vector<ControlPoint>& controlPoints,
const AZ::Color& color);
AZStd::vector<ControlPoint> m_pastControlPoints;
AZStd::vector<ControlPoint> m_futureControlPoints;
};
} // namespace EMotionFX::MotionMatching

@ -0,0 +1,62 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <Fixture.h>
#include <FeatureMatrix.h>
namespace EMotionFX::MotionMatching
{
class FeatureMatrixFixture
: public Fixture
{
public:
void SetUp() override
{
Fixture::SetUp();
// Construct 3x3 matrix:
// 1 2 3
// 4 5 6
// 7 8 9
m_featureMatrix.resize(3, 3);
float counter = 1.0f;
for (size_t row = 0; row < 3; ++row)
{
for (size_t column = 0; column < 3; ++column)
{
m_featureMatrix(row, column) = counter;
counter++;
}
}
}
FeatureMatrix m_featureMatrix;
};
TEST_F(FeatureMatrixFixture, AccessOperators)
{
EXPECT_FLOAT_EQ(m_featureMatrix(1, 1), 5.0f);
EXPECT_FLOAT_EQ(m_featureMatrix(0, 2), 3.0f);
EXPECT_FLOAT_EQ(m_featureMatrix.coeff(2, 1), 8.0f);
EXPECT_FLOAT_EQ(m_featureMatrix.coeff(1, 2), 6.0f);
}
TEST_F(FeatureMatrixFixture, SetValue)
{
m_featureMatrix(1, 1) = 100.0f;
EXPECT_FLOAT_EQ(m_featureMatrix(1, 1), 100.0f);
}
TEST_F(FeatureMatrixFixture, Size)
{
EXPECT_EQ(m_featureMatrix.size(), 9);
EXPECT_EQ(m_featureMatrix.rows(), 3);
EXPECT_EQ(m_featureMatrix.cols(), 3);
}
} // namespace EMotionFX::MotionMatching

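The tests above exercise a dense-matrix interface with `(row, column)` access, `coeff()`, `size()`, `rows()`, and `cols()`. The real `FeatureMatrix` wraps a proper linear-algebra type; the indexing contract the tests rely on can be sketched with a toy row-major matrix (all names here are illustrative, not the engine implementation):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy row-major float matrix mirroring the accessors used by the
// FeatureMatrixFixture tests. Not the real FeatureMatrix.
class ToyMatrix
{
public:
    void resize(std::size_t rows, std::size_t cols)
    {
        m_rows = rows;
        m_cols = cols;
        m_data.assign(rows * cols, 0.0f);
    }
    // Mutable element access, as in m_featureMatrix(row, column) = value.
    float& operator()(std::size_t row, std::size_t col) { return m_data[row * m_cols + col]; }
    // Read-only element access, as in m_featureMatrix.coeff(row, column).
    float coeff(std::size_t row, std::size_t col) const { return m_data[row * m_cols + col]; }
    std::size_t size() const { return m_rows * m_cols; } // Total element count.
    std::size_t rows() const { return m_rows; }
    std::size_t cols() const { return m_cols; }

private:
    std::size_t m_rows = 0;
    std::size_t m_cols = 0;
    std::vector<float> m_data;
};
```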
@ -0,0 +1,81 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <Fixture.h>
#include <FeaturePosition.h>
#include <FeatureSchema.h>
#include <FeatureSchemaDefault.h>
#include <FeatureTrajectory.h>
#include <FeatureVelocity.h>
namespace EMotionFX::MotionMatching
{
class FeatureSchemaFixture
: public Fixture
{
public:
void SetUp() override
{
Fixture::SetUp();
m_featureSchema = AZStd::make_unique<FeatureSchema>();
DefaultFeatureSchema(*m_featureSchema.get(), {});
}
void TearDown() override
{
Fixture::TearDown();
m_featureSchema.reset();
}
AZStd::unique_ptr<FeatureSchema> m_featureSchema;
};
TEST_F(FeatureSchemaFixture, AddFeature)
{
m_featureSchema->AddFeature(aznew FeaturePosition());
m_featureSchema->AddFeature(aznew FeatureVelocity());
m_featureSchema->AddFeature(aznew FeatureTrajectory());
EXPECT_EQ(m_featureSchema->GetNumFeatures(), 9);
}
TEST_F(FeatureSchemaFixture, Clear)
{
m_featureSchema->Clear();
EXPECT_EQ(m_featureSchema->GetNumFeatures(), 0);
}
TEST_F(FeatureSchemaFixture, GetNumFeatures)
{
EXPECT_EQ(m_featureSchema->GetNumFeatures(), 6);
}
TEST_F(FeatureSchemaFixture, GetFeature)
{
EXPECT_EQ(m_featureSchema->GetFeature(1)->RTTI_GetType(), azrtti_typeid<FeaturePosition>());
EXPECT_STREQ(m_featureSchema->GetFeature(3)->GetName().c_str(), "Left Foot Velocity");
}
TEST_F(FeatureSchemaFixture, GetFeatures)
{
int counter = 0;
for (const Feature* feature : m_featureSchema->GetFeatures())
{
AZ_UNUSED(feature);
counter++;
}
EXPECT_EQ(counter, 6);
}
TEST_F(FeatureSchemaFixture, FindFeatureById)
{
const Feature* feature = m_featureSchema->GetFeature(1);
const AZ::TypeId id = feature->GetId();
const Feature* result = m_featureSchema->FindFeatureById(id);
EXPECT_EQ(result, feature);
}
} // namespace EMotionFX::MotionMatching
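The `FindFeatureById` test above pins down the lookup contract: searching the schema with a feature's own id must return a pointer to that same feature. A toy version of that contract, using plain ints in place of `AZ::TypeId` and a linear search (all names illustrative):

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Toy stand-in for a schema entry; the real schema stores polymorphic
// Feature objects keyed by an AZ::TypeId.
struct ToyFeature
{
    int m_id;
    std::string m_name;
};

// Sketch of an id-based lookup: linear search, nullptr when not found.
const ToyFeature* FindToyFeatureById(const std::vector<ToyFeature>& features, int id)
{
    const auto it = std::find_if(features.begin(), features.end(),
        [id](const ToyFeature& feature) { return feature.m_id == id; });
    return (it != features.end()) ? &(*it) : nullptr;
}
```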

@ -0,0 +1,23 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzTest/AzTest.h>
#include <MotionMatchingSystemComponent.h>
#include <Tests/SystemComponentFixture.h>
namespace EMotionFX::MotionMatching
{
using Fixture = ComponentFixture<
AZ::MemoryComponent,
AZ::AssetManagerComponent,
AZ::JobManagerComponent,
AZ::StreamerComponent,
EMotionFX::Integration::SystemComponent,
MotionMatchingSystemComponent
>;
}

@ -0,0 +1,11 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzTest/AzTest.h>
AZ_UNIT_TEST_HOOK(DEFAULT_UNIT_TEST_ENV);

@ -0,0 +1,11 @@
/*
* Copyright (c) Contributors to the Open 3D Engine Project.
* For complete copyright and license terms please see the LICENSE at the root of this distribution.
*
* SPDX-License-Identifier: Apache-2.0 OR MIT
*
*/
#include <AzTest/AzTest.h>
AZ_UNIT_TEST_HOOK(DEFAULT_UNIT_TEST_ENV);

@ -0,0 +1,12 @@
#
# Copyright (c) Contributors to the Open 3D Engine Project.
# For complete copyright and license terms please see the LICENSE at the root of this distribution.
#
# SPDX-License-Identifier: Apache-2.0 OR MIT
#
#
set(FILES
Source/MotionMatchingEditorSystemComponent.cpp
Source/MotionMatchingEditorSystemComponent.h
)
