XR Mind Mapper - progress report

While I wait for Visual Studio to update, here is a progress report on the XR Mind Mapper project.

In the last post, I ended up with an XRI-enabled Unity project that renders to Oculus devices while still allowing the use of the XR Device Simulator for faster development.

Since then, I have been working on scaffolding the core behaviour of the project. So far, it supports the following:

- Generation of a connected graph from a hard-coded data model

- Automated layout of the graph using Unity physics (spring joints, rigidbodies)

- XRI-based selection of graph nodes and edges

This is basic stuff, but it gives me something to build on.
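As a rough sketch of the physics-driven layout described above (the names here are my own, not the actual project code): each node gets a `Rigidbody`, and each edge adds a `SpringJoint` between its endpoints, so Unity's physics engine relaxes the graph into shape on its own.

```csharp
using UnityEngine;

// Hypothetical helper: connect two node GameObjects with a spring so the
// physics engine pulls connected nodes toward a target edge length.
public static class GraphPhysics
{
    public static SpringJoint ConnectNodes(GameObject a, GameObject b, float restLength = 0.5f)
    {
        // Each node needs a Rigidbody; gravity is off so the graph floats.
        var rbA = a.TryGetComponent<Rigidbody>(out var foundA) ? foundA : a.AddComponent<Rigidbody>();
        var rbB = b.TryGetComponent<Rigidbody>(out var foundB) ? foundB : b.AddComponent<Rigidbody>();
        rbA.useGravity = false;
        rbB.useGravity = false;

        // The spring joint acts as the edge: it pulls the nodes toward the
        // rest length while the damper settles the motion into a stable layout.
        var joint = a.AddComponent<SpringJoint>();
        joint.connectedBody = rbB;
        joint.autoConfigureConnectedAnchor = false;
        joint.anchor = Vector3.zero;
        joint.connectedAnchor = Vector3.zero;
        joint.minDistance = restLength;
        joint.maxDistance = restLength;
        joint.spring = 10f;
        joint.damper = 2f;
        return joint;
    }
}
```

The spring and damper values here are placeholder tuning constants; in practice they would need experimentation per graph size.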

The user interface will be key to this application's success. It needs to be intuitive, yet rich enough to support the creation, layout and display of complex mind maps. Here are the features I want to add:


Graph Viewing:  

As a user, I want to be able to intuitively change how the graph is displayed using the VR controllers, including rotating, scaling and translating the graph.
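One way this could be wired up with XRI (a sketch under the assumption that the graph hangs off a single root object; the class name is hypothetical) is to make the graph root grabbable, which gives translation and rotation almost for free:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical setup: make the graph's root object grabbable so the whole
// graph can be translated and rotated with a controller. Two-handed
// scaling would need custom interaction logic on top of this.
[RequireComponent(typeof(Rigidbody))]
public class GraphGrabSetup : MonoBehaviour
{
    void Awake()
    {
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        grab.movementType = XRBaseInteractable.MovementType.Kinematic;
        grab.trackPosition = true;  // translate the graph with the hand
        grab.trackRotation = true;  // rotate the graph with the wrist
        grab.throwOnDetach = false; // the graph should stay where released
    }
}
```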

Graph Layout:  

As a user, I want rich control over how the graph is displayed, including being able to

  - automatically lay out the graph using a force-weighted algorithm

  - control the position of individual nodes

  - change the density of the graph

  - align the graph to a plane  
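For the force-weighted layout, one possible shape (a sketch, not the project's actual algorithm) is Fruchterman–Reingold style: the spring joints already supply attraction along edges, so a small component only needs to add pairwise repulsion between all nodes:

```csharp
using UnityEngine;

// Sketch of the repulsion half of a force-directed layout, assuming
// spring joints on the edges already provide attraction. All names and
// tuning values here are illustrative.
public class NodeRepulsion : MonoBehaviour
{
    public Rigidbody[] nodes;       // assigned when the graph is built
    public float repulsion = 0.5f;  // tuning constant, found by experiment

    void FixedUpdate()
    {
        // Inverse-square repulsion between every pair of nodes, so
        // unconnected nodes spread apart instead of overlapping.
        for (int i = 0; i < nodes.Length; i++)
        {
            for (int j = i + 1; j < nodes.Length; j++)
            {
                Vector3 delta = nodes[i].position - nodes[j].position;
                float sqrDist = Mathf.Max(delta.sqrMagnitude, 0.01f); // avoid blow-up at zero distance
                Vector3 force = delta.normalized * (repulsion / sqrDist);
                nodes[i].AddForce(force);
                nodes[j].AddForce(-force);
            }
        }
    }
}
```

Scaling `repulsion` (together with the springs' rest length) could also serve as the "density" control mentioned above.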

Graph Editing: 

As a user, I want to be able to intuitively edit the graph, including being able to

  - add or delete nodes

  - add or remove edges between nodes

  - change the displayed text, shape, size and color of a node

  - change the displayed color and thickness of an edge
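The editing operations suggest the shape of the underlying data model. The post doesn't show the actual hard-coded model, so this is a hypothetical minimal version of what the edit operations would act on:

```csharp
using System.Collections.Generic;

// Hypothetical in-memory model for the editing user stories; the real
// project's data model may look quite different.
public class MindMapGraph
{
    public class Node { public string Id; public string Text; }
    public class Edge { public string FromId; public string ToId; }

    public readonly List<Node> Nodes = new List<Node>();
    public readonly List<Edge> Edges = new List<Edge>();

    public Node AddNode(string id, string text)
    {
        var node = new Node { Id = id, Text = text };
        Nodes.Add(node);
        return node;
    }

    public void DeleteNode(string id)
    {
        // Deleting a node also removes every edge touching it,
        // so the graph never holds dangling edges.
        Nodes.RemoveAll(n => n.Id == id);
        Edges.RemoveAll(e => e.FromId == id || e.ToId == id);
    }

    public void AddEdge(string fromId, string toId) =>
        Edges.Add(new Edge { FromId = fromId, ToId = toId });

    public void RemoveEdge(string fromId, string toId) =>
        Edges.RemoveAll(e => e.FromId == fromId && e.ToId == toId);
}
```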

Graph Persistence:

  - As a user, I want to be able to save a graph and load it in a future session.
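Persistence could be as simple as serializing a snapshot of the graph to JSON in Unity's persistent data folder. A sketch, assuming a flat serializable snapshot type (all names here are illustrative):

```csharp
using System.IO;
using UnityEngine;

// Hypothetical serializable snapshot of a graph. JsonUtility only
// serializes public fields on [Serializable] types, hence the flat arrays.
[System.Serializable]
public class GraphSave
{
    public string[] nodeIds;
    public string[] nodeTexts;
    public Vector3[] nodePositions;
    public int[] edgeFrom;  // indices into the node arrays
    public int[] edgeTo;
}

public static class GraphPersistence
{
    static string PathFor(string name) =>
        Path.Combine(Application.persistentDataPath, name + ".json");

    public static void Save(GraphSave graph, string name) =>
        File.WriteAllText(PathFor(name), JsonUtility.ToJson(graph));

    public static GraphSave Load(string name) =>
        JsonUtility.FromJson<GraphSave>(File.ReadAllText(PathFor(name)));
}
```

Storing node positions alongside the model would let a loaded graph reappear exactly where the user left it, rather than re-running the layout from scratch.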

I think these requirements give a good enough framework to start designing the UI. Next, I am going to work on building a UI framework and testing some design patterns.



