Monday, July 13, 2015

Running jMonkeyEngine 3 on Android using AndroidHarnessFragment

There's a new way of running jMonkeyEngine 3 on Android, using fragments. Since I couldn't find a description of how to do it or a use case, I thought I'd write down how I did it.
The old (and still functional) way used a class called AndroidHarness, which extended Activity and contained all the app-specific information. This post describes how to use the new AndroidHarnessFragment.
  • First of all, AndroidHarnessFragment contains a lot of app-specific settings. The most important of these is appClass, a String containing the fully qualified name of the Application to run. This and many other fields are protected, and it seems the intended way of using the class is to extend it (see the sketch after this list). That way you can set all of them in the constructor of the new class, even if there are other ways more aligned with Android conventions (such as using a Bundle).
  • Previously the auto-generated MainActivity class extended AndroidHarness. Now it should just extend Activity.
  • To create the fragment, we can create a layout file for the activity:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
    <fragment android:name="com.mycompany.mygame.MyAndroidHarnessFragment"
        android:id="@+id/app_fragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</LinearLayout>
  • Finally, we tell MainActivity to use the layout file we just created:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
}
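Putting the first bullet into code, here's a minimal sketch of such a subclass, assuming appClass is the only field you need to set (package and class names are illustrative):
public class MyAndroidHarnessFragment extends AndroidHarnessFragment {
    public MyAndroidHarnessFragment() {
        // Fully qualified name of the jME3 Application to run.
        // Other protected fields (e.g. display settings) could be set here too.
        appClass = "com.mycompany.mygame.Main";
    }
}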
That's all that seems to be needed to run jMonkeyEngine inside a fragment!

Sunday, February 22, 2015

Google Cardboard support for jMonkeyEngine

I've just committed support for Google Cardboard to jMonkeyEngine 3. I plan on doing a post with more details on the inner workings, but in the meantime, here's a brief outline and how to use it.

First of all, it's a completely separate integration from the Android VR support project I started. This one uses the Google Cardboard API directly, in a way that didn't fit the architecture of that project.

Why is it useful?

It enables those who wish to use a complete game development package to deploy their application as a Google Cardboard app.

So, how do I use it?

Download the jme-cardboard.jar from the repo and add it to your jMonkeyEngine project.
  1. Turn on Android deployment for your project (Properties/Application/Android)
  2. In the generated MainActivity.java (Important Files/Android Main Activity), make it extend CardboardHarness instead of AndroidHarness.
  3. In the same file, change the appClass to point to your project's application class (see the sketch below).
That should be it. For an example application, check out the CardboardStarTravel example in the test package.
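A minimal sketch of steps 2 and 3, assuming CardboardHarness exposes the same protected appClass field as AndroidHarness (package and class names are illustrative):
public class MainActivity extends CardboardHarness {
    public MainActivity() {
        // Fully qualified name of your jME3 Application class
        appClass = "com.mycompany.mygame.Main";
    }
}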



If you wish to build from source, you need to have an android.jar attached to the project.



Known issues:
  • Drift with the accelerometer is pretty bad. I don't know if there is anything to be done about it on the application side.
  • Movement is fairly jittery. Adding a filter for the accelerometer might be desirable; a sketch of one option follows.
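One common option is a simple exponential low-pass filter applied to the raw sensor values before they are used. This is a generic sketch, not code from the plugin; the ALPHA value is a starting point to tune:
// Exponential low-pass filter for raw sensor values.
// A larger ALPHA trusts new readings more; 0.1f-0.3f is a common starting range.
private static final float ALPHA = 0.2f;

private float[] lowPass(float[] input, float[] output) {
    if (output == null) {
        return input.clone();
    }
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}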


Like I said, there will be more to come!

Saturday, January 24, 2015

Designing a virtual reality HMD for smart phones

Update: Design is now available on Thingiverse.

This blog is supposed to be about software, I know. But without proper hardware, it's impossible to write any software, and I realized when putting together the Android VR library for jMonkeyEngine 3.0 that I had nothing to test it on.
The quickest solution to that problem would be cutting out a Google Cardboard. I'm not particularly fond of cardboard, however, and without laser cutting it would look terrible. I have the benefit of owning a 3D printer (a Prusa i3), so I thought I'd have a go at designing my own HMD, inspired by the Google Cardboard schematics. Those can only be followed so far, though, since they're meant to be cut and folded; with a 3D printer you have the benefit of printing complex geometry directly.
I decided to build it in 3 steps, each as simple as possible to avoid overhang problems. The first one would be the cradle where the phone would rest.
I based the measurements on my Samsung Galaxy S4 and tried to design it to allow access to the buttons on the side of the phone as well as the USB and audio ports. In general, I tried to leave as much space as possible on the sides for different phone types.
Printing time is always an issue with hobby printers, which is why I left the back side open. That, and to allow the battery some fresh air. There are holes in it as well (in the design, anyway; the printer doesn't really make them). These are there in case one would at some point like to mimic the Oculus Rift DK2's positional tracking by placing some LEDs there.

Next, I moved on to the piece next to the eyes, as the middle part would just be about creating some distance between the lenses and the screen (or so I thought). I happen to have an Oculus Rift DK1 which isn't seeing much use these days, so I decided to butcher two of its eye cups for lenses. They are 36 mm in diameter, which should give some additional FOV compared to the 25 mm recommended for the Google Cardboard. Apart from making good fittings for the lenses, the biggest challenge with this piece was making it fit well around the face.


I think I actually spent most of the time on the middle piece. Modelling a good cup for the nose was a big challenge, and I scrapped several prints because it didn't print well. It's very spacious and should suit most nose shapes and sizes. The other notable thing is that it's slightly wider at the bottom than at the top, because I made the end next to the face slightly narrower than the phone cradle. If I were to build a Note-sized cradle, this would be even more pronounced.


Below is the current state of the prototype. It works very well with the Google Cardboard demos. I can see about 5 mm outside of my S4's screen at the top and bottom, so maybe an S5 would be perfect for these lenses.



I want to add a fitting for a magnet as well, but I seem to have lost the magnets I bought, so that will have to wait.
I plan on sharing these as well as a BOM for a complete HMD once I'm happy with the design. Stay tuned for more.

Friday, January 2, 2015

Virtual Reality for Android using jMonkeyEngine

Oculus Rift may be leading the pack currently, but I'm sure there will be more contenders for the virtual reality throne shortly. So, while the Oculus Rift plugin was a good start, I think it's time to look into what it would take to support more devices. The architecture established for the Oculus Rift plugin is good enough, so I decided to see how much effort it would take to implement a basic virtual reality API for Android. After all, the low-budget Google Cardboard probably makes Android the most accessible VR device of all.

You can find the repository for the project here: https://github.com/neph1/jme-androidvr

Usage

It's implemented with ease of use in mind. An application wishing to use it needs to do two things.
  1. Create a VRAppState instance and supply a suitable HeadMountedDisplay (currently either a DummyDisplay or AndroidDisplay).
AndroidDisplay display = new AndroidDisplay();
VRAppState vrAppState = new VRAppState(display);
stateManager.attach(vrAppState);
  2. For controls, get the StereoCameraControl from the VRAppState and add it as a Control to a Spatial. It will then follow the Spatial through the world.
Node observer = new Node("");
observer.addControl(vrAppState.getCameraControl());
rootNode.attachChild(observer);
See AndroidVRTest for an example implementation. 

Method

As I stated in the beginning, it closely follows what has already been implemented in the Oculus Rift plugin, but classes have been abstracted to allow for more diverse future implementations.
It revolves around a class called VRAppState. This class sets up two viewports and a StereoCameraControl which handles the two different views; the sketch below illustrates the idea.
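To illustrate the two-viewport idea in generic jME3 terms (this is a sketch assuming a SimpleApplication context where cam, renderManager, and rootNode are available, not the actual VRAppState code):
// Each eye gets its own camera, rendering to one half of the screen.
Camera leftCam = cam.clone();
leftCam.setViewPort(0.0f, 0.5f, 0.0f, 1.0f);   // left half
ViewPort leftView = renderManager.createMainView("LeftEye", leftCam);
leftView.attachScene(rootNode);

Camera rightCam = cam.clone();
rightCam.setViewPort(0.5f, 1.0f, 0.0f, 1.0f);  // right half
ViewPort rightView = renderManager.createMainView("RightEye", rightCam);
rightView.attachScene(rootNode);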
The StereoCameraControl class gets its data (currently only rotation) from a class implementing the HeadMountedDisplay interface; in this example it's called AndroidDisplay. The AndroidDisplay class accesses the Android application and registers itself as a SensorEventListener for the accelerometer and magnetometer. The default update delay is way too slow, so it uses SENSOR_DELAY_GAME instead.
// Look up the sensor service through the Android activity
sensorManager = (SensorManager) JmeAndroidSystem.getActivity().getApplication().getSystemService(Activity.SENSOR_SERVICE);
accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magnetometer = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
// SENSOR_DELAY_GAME updates more often than the default SENSOR_DELAY_NORMAL
sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
sensorManager.registerListener(this, magnetometer, SensorManager.SENSOR_DELAY_GAME);
Once sensor data is updated, it's received by the onSensorChanged method. It updates our local values and confirms that data has been received from both sensors before getting the rotational data of the device in the form of a matrix. This is stored in a temporary field, and the orientation is then interpolated towards it, since using the raw data directly was much too jittery.
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        gravity = event.values;
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        geomagnetic = event.values;
    }
    // Only compute orientation once both sensors have reported data
    if (gravity != null && geomagnetic != null) {
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
        if (success) {
            SensorManager.getOrientation(R, orientationVector);
            tempQuat.fromAngles(orientationVector[2], -orientationVector[1], orientationVector[0]);
            // Interpolate towards the new rotation to smooth out jitter
            orientation.slerp(tempQuat, 0.2f);
        }
    }
}

It also needs to know the physical size of the screen, which is used by the distortion shader. With some conversion, it can be deduced from the Android application's WindowManager.
DisplayMetrics displaymetrics = new DisplayMetrics();
JmeAndroidSystem.getActivity().getWindow().getWindowManager().getDefaultDisplay().getMetrics(displaymetrics);
// pixels / dpi = inches; inchesToMeters (presumably 0.0254f) converts to meters
float screenHeight = displaymetrics.heightPixels / displaymetrics.ydpi * inchesToMeters;
float screenWidth = displaymetrics.widthPixels / displaymetrics.xdpi * inchesToMeters;
This and other information is stored in a class inspired by the Oculus Rift HMDInfo, called HeadMountedDisplayData. It contains data on the HMD itself: distance between lenses, distance from screen to lenses, resolution, and so on.
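As a rough illustration of what such a data holder might contain (field names are my guesses, not the actual API):
// Illustrative sketch only; the real HeadMountedDisplayData fields may differ.
public class HeadMountedDisplayData {
    public float lensSeparationDistance; // distance between lens centers, in meters
    public float eyeToScreenDistance;    // distance from screen to lenses, in meters
    public float screenWidth;            // physical screen width, in meters
    public float screenHeight;           // physical screen height, in meters
    public int horizontalResolution;     // in pixels
    public int verticalResolution;       // in pixels
}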

The shader uses the same principle established early on in the Oculus Rift plugin, which itself was inspired by an example implementation on the Oculus Developer website (it seems to have since been removed; if anyone has a link, please let me know). Each display has a post-processing filter, and the necessary distortion correction is done in a fragment shader. It all begins with the class called BarrelDistortionFilter, which is instantiated in the VRAppState class.
The BarrelDistortionFilter takes the information from the HeadMountedDisplayData and creates a projection matrix for the Camera associated with its ViewPort. It also prepares some variables for the shader.
The scaleFactor value is an arbitrary number used to fit a specific screen. This most likely needs a formula for different screen sizes.
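For reference, the "barrel" distortion behind such a filter typically scales each point's distance from the lens center by a polynomial in the squared radius. Here's a Java sketch of the idea (not the plugin's actual shader; the coefficients are illustrative):
// Radial barrel distortion: push points outward more the further they are
// from the lens center, to counteract the pincushion effect of the lenses.
static final float[] K = {1.0f, 0.22f, 0.24f, 0.0f}; // illustrative coefficients

static float[] distort(float x, float y) {
    float r2 = x * x + y * y; // squared distance from the lens center
    float scale = K[0] + r2 * (K[1] + r2 * (K[2] + r2 * K[3]));
    return new float[]{x * scale, y * scale};
}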

References

jMonkeyEngine Oculus Rift plugin:
Sensors overview:
Registering sensors and reading orientation data:
Google Cardboard: