
Module installed and enabled but apps not responding to any movement #25

Open
illtellyoulater opened this issue Mar 2, 2017 · 34 comments

Comments

@illtellyoulater

illtellyoulater commented Mar 2, 2017

Hi, I have a Moto G (first gen) smartphone with compass and accelerometer sensors.
I installed the VirtualSensor module from the Xposed Framework, enabled it, and rebooted the phone.

The Virtual Sensor application correctly shows gyroscope data being updated as I move my phone.
However, any app requiring a gyroscope (Cardboard, YouTube 360°) does not respond to movement at all.

Can you please help me understand where the problem is?
Thanks a lot.

@okejokej

okejokej commented Mar 2, 2017

It isn't a problem with your phone; the module is simply still unfinished and does not support most apps yet. It seems that the developer has overcome the major issue but hasn't released a working version yet because he's been busy with other things.

If you want to play with it a bit, the module works with Cardboard version 1.5. There are also some apps I know of that have their own compass tracking and don't require Xposed - VR Cave, Roller Coaster VR, Sites in VR. You can also use compass-based phone tracking in Street View in the Google Maps app and the Street View app, but only in non-VR mode.

Let's hope the developer hasn't abandoned this module, though :P

@illtellyoulater
Author

illtellyoulater commented Mar 3, 2017

@okejokej thanks! Using version 1.5 of Cardboard I could finally experiment with it :).

It gave me an idea of what it is like, but I did not imagine that the gyroscope emulation via compass & accelerometer would be so jittery.
I think all it needs is some smoothing function and it could be much better, like other apps do with their own emulated gyroscope/rotation sensor.

Another big disappointment is that YouTube 360° videos still don't work...

But then again, thanks to @Frazew for coding this and letting us experience a glimpse of VR :)

By the way, if you want to experience 360° videos there is another app which supports them: VR Player Free at https://play.google.com/store/apps/details?id=com.vimersiv.vrplayerfree.

You have to enable compass + accelerometer in the app's advanced settings and then set the screen format to "sphere". The only problem is that you have to copy and paste the video URL for each video you want to watch.

@renantopac

Hi, I faced the same issue; after a lot of research and study I solved it, and it's simple.
First, thanks @Frazew for the great project - it's awesome and the code is very well organized.
Well, I have a Moto X Play with Android Marshmallow 6.0.1. I didn't test on other devices because this is the only one I have, but I believe the solution will work on all Android devices with the same version.

1 - First I downloaded the source code from @Frazew's GitHub, then I created a new project in Android Studio and recreated the entire project (do not import anything; copy and adapt the code line by line). I did this because I had a lot of error messages.

2 - After the successful build of the project I needed to change 2 lines of code in the file XposedMod.java.
The lines:
final Class headTransform = XposedHelpers.findClassIfExists("com.google.vrtoolkit.cardboard.HeadTransform", lpparam.classLoader);

and

final Class eye = XposedHelpers.findClassIfExists("com.google.vrtoolkit.cardboard.Eye", lpparam.classLoader);

I changed "com.google.vrtoolkit.cardboard" to "com.google.vr.sdk.base" in both lines.
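For clarity, the two lines end up looking like this (just a sketch of the substitution described above; the rest of XposedMod.java stays unchanged):

    // Same lookups as before, only the package prefix changes to the newer Google VR SDK.
    final Class headTransform = XposedHelpers.findClassIfExists("com.google.vr.sdk.base.HeadTransform", lpparam.classLoader);
    final Class eye = XposedHelpers.findClassIfExists("com.google.vr.sdk.base.Eye", lpparam.classLoader);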

I did these steps and it worked for me. My Cardboard version is 1.8.

If @Frazew agrees, I could attach the APK and/or the project here.

Thanks!

@Frazew
Owner

Frazew commented Aug 2, 2017

Oh, I'll give it a shot when I can, thank you for the detailed analysis !

@okejokej

okejokej commented Aug 2, 2017

Unfortunately the 1.5 "branch" doesn't seem to work on APIs below 23 at all. I've compiled it and, while it says "gyroscope: true", all it shows for the gyro values is "...". The theoretical values are shown as they should be. The same happens with Frazew's 1.5 build. Tested on API 22.

After getting the same build error as everyone else, I somehow managed to build the project without re-creating it... but I have no idea how, sorry =(

@Frazew
Owner

Frazew commented Aug 2, 2017

Indeed, the latest changes I made were tested on Marshmallow and use a temporary "new" API 23 hook. I have yet to write it for older SDKs. Although I haven't purposely broken anything, it is very likely that a change I made (I can't remember which right now) broke the other hooks.
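Roughly, the idea is to pick a hook implementation per SDK level, something like this (a minimal sketch; the two helper methods are placeholders, not the module's actual code):

    // Sketch only: dispatch to a per-SDK sensor-change hook (requires android.os.Build).
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
        hookSensorChangeForApi23(lpparam); // hypothetical helper for the temporary "new" API 23 hook
    } else {
        hookSensorChangeForOlderSdks(lpparam); // hypothetical helper for the pre-23 hooks still to be fixed
    }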

@okejokej

okejokej commented Aug 9, 2017

Unfortunately, after building the newest sources on API 22 (TP-Link Neffos C5) I still don't get any gyroscope values; only the "theoretical values" seem correct. On 1.4x the gyroscope values and the theoretical values were identical.

https://user-images.githubusercontent.com/13890794/29138934-6cad4dba-7d45-11e7-9c4a-8747ef30e10c.png

Edit: Never mind, I THINK my phone is the problem here. I just remembered that it has some weird ROM and Xposed doesn't work on it without the "Disable resource hooks" option enabled... so yeah, that's probably it :D

@Frazew
Owner

Frazew commented Aug 9, 2017

Do you happen to know the details about your ROM ? Is it open source ?
It looks like the gyroscope is added but does not send out any values, which might be caused by a different handling of the sensors within the ROM.

Edit: I just saw that it worked with the previous version, so I broke something, and that should not happen. That means the problem comes from my changes.

Edit 2: This is a blind "fix" (by the way, could you provide a logcat / Xposed log in case it doesn't work), but could you build it again, replacing the class fr.frazew.virtualgyroscope.hooks.sensorchange.API18 with this one and modifying this line to new fr.frazew.virtualgyroscope.hooks.sensorchange.API18(lpparam)); ?

@okejokej

okejokej commented Aug 10, 2017

It is not open source - it's not a custom rom.

Unfortunately it still doesn't work:

08-10 08:06:36.582 E/Xposed (2745): java.lang.NoSuchMethodError: android.hardware.SystemSensorManager$BaseEventQueue#enableSensor(android.hardware.Sensor,int,int)#exact
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.XposedHelpers.findMethodExact(XposedHelpers.java:339)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.XposedHelpers.findAndHookMethod(XposedHelpers.java:176)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.XposedHelpers.findAndHookMethod(XposedHelpers.java:251)
08-10 08:06:36.582 E/Xposed (2745): at fr.frazew.virtualgyroscope.XposedMod.enableSensors(XposedMod.java:102)
08-10 08:06:36.582 E/Xposed (2745): at fr.frazew.virtualgyroscope.XposedMod.handleLoadPackage(XposedMod.java:61)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.IXposedHookLoadPackage$Wrapper.handleLoadPackage(IXposedHookLoadPackage.java:34)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.callbacks.XC_LoadPackage.call(XC_LoadPackage.java:61)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.callbacks.XCallback.callAll(XCallback.java:106)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.XposedBridge$1.beforeHookedMethod(XposedBridge.java:193)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.XposedBridge.handleHookedMethod(XposedBridge.java:720)
08-10 08:06:36.582 E/Xposed (2745): at android.app.ActivityThread.handleBindApplication(<Xposed>)
08-10 08:06:36.582 E/Xposed (2745): at android.app.ActivityThread.access$1500(ActivityThread.java:182)
08-10 08:06:36.582 E/Xposed (2745): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1535)
08-10 08:06:36.582 E/Xposed (2745): at android.os.Handler.dispatchMessage(Handler.java:111)
08-10 08:06:36.582 E/Xposed (2745): at android.os.Looper.loop(Looper.java:194)
08-10 08:06:36.582 E/Xposed (2745): at android.app.ActivityThread.main(ActivityThread.java:5662)
08-10 08:06:36.582 E/Xposed (2745): at java.lang.reflect.Method.invoke(Native Method)
08-10 08:06:36.582 E/Xposed (2745): at java.lang.reflect.Method.invoke(Method.java:372)
08-10 08:06:36.582 E/Xposed (2745): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:960)
08-10 08:06:36.582 E/Xposed (2745): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:755)
08-10 08:06:36.582 E/Xposed (2745): at de.robv.android.xposed.XposedBridge.main(XposedBridge.java:132)

It gives the same error without the fixes you now provided - I should've dumped a log sooner :)

@okejokej

okejokej commented Aug 10, 2017

Okay, good news. Firstly, I went back to the most recent source from before the fixes you just proposed. On a Chinese site (http://blog.csdn.net/kc58236582/article/details/50237123) I found a variant of the sensors framework for Android 5.1, where the enableSensor method has an extra int argument (int reservedFlags). I added this extra int to your enableSensors hook for API >= 18 and it now WORKS. I also applied renantopac's fix for the Cardboard hook, and Cardboard-based apps also work and react to movement now.
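For reference, the change boils down to hooking the four-argument variant of enableSensor - a sketch, assuming Android 5.1's SystemSensorManager$BaseEventQueue.enableSensor really does take the extra int reservedFlags described above; the hook body itself stays whatever the module already does:

    // Hook the 5.1 variant of enableSensor, which carries one extra int (reservedFlags).
    XposedHelpers.findAndHookMethod("android.hardware.SystemSensorManager$BaseEventQueue",
            lpparam.classLoader, "enableSensor",
            android.hardware.Sensor.class, int.class, int.class, int.class,
            new XC_MethodHook() {
                @Override
                protected void beforeHookedMethod(MethodHookParam param) throws Throwable {
                    // same logic as the module's existing enableSensors hook in XposedMod.java
                }
            });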
Now the problem is that, while it reacts to movements mostly correctly, there are random spikes in the gyroscope values (where they differ from the theoretical values) that make it jump all over the place. The theoretical values seem okay - they react to movement without any weird spikes.

Edit: okay, only some apps seem to work after all - like Cardboard camera. Oh well.

@Frazew
Owner

Frazew commented Aug 10, 2017

Thanks for your help ! I'll add this variant to the enableSensors hook.
I have tested renantopac's fix but it didn't work; I guess, as you said, it only works with some apps (I tested it with YouTube). I guess I'll add both hooks.
As for the spikes, it might be that some values are sent when they shouldn't be. I'll look into this. If that's not the case, then it might also be that the rotation vector needs a filter too.

@okejokej

okejokej commented Aug 10, 2017

One thing I wonder about - shouldn't the Cardboard hook use TYPE_GAME_ROTATION_VECTOR rather than TYPE_ROTATION_VECTOR? The Google Cardboard headset has a built-in magnet, and since the TYPE_ROTATION_VECTOR sensor uses data from the accelerometer, gyroscope AND magnetometer, the magnet would interfere with its readings. So I think it'd be more natural for Google to use TYPE_GAME_ROTATION_VECTOR, which does NOT use the magnetometer for its readings.

Here's a conversation I found that supports this hypothesis, albeit a little old and not directly related to Cardboard:
https://bugs.chromium.org/p/chromium/issues/detail?id=397824

> In the official Cardboard android application the orientation of the phone is retrieved via the Sensor.TYPE_GAME_ROTATION_VECTOR which does NOT use the Magnetometer. This way the Cardboard demo isn't disturbed by the field of the magnet.
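To illustrate the difference (a generic snippet, not module code - it assumes a Context and a SensorEventListener are available): TYPE_GAME_ROTATION_VECTOR is requested exactly like the regular rotation vector, it just leaves the magnetometer out of the sensor fusion:

    // Generic example: request the game rotation vector, which ignores the magnetometer.
    SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    Sensor gameRotation = sm.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR);
    sm.registerListener(listener, gameRotation, SensorManager.SENSOR_DELAY_GAME);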

Edit: BTW, renantopac's fix does work for me in YouTube 360 videos. It works almost perfectly in mono mode, but if I switch to Cardboard mode, the videos get all distorted (still tracking movement though). So we're definitely onto something :)

@renantopac

> As for the spikes, it might be that some values are sent when they shouldn't be. I'll look into this. If that's not the case, then it might also be that the rotation vector needs a filter too.

I recently found a project that applies a low-pass filter to the sensor calculations, based on a coefficient; perhaps it will help. I haven't implemented it in VirtualSensor yet, but the code is:

    public synchronized void update(long t, float x, float y, float z) {
        this.lastT = this.t;

        if (lowPassFilterEnabled && hasBeenSet()) {
            // Exponential smoothing: the closer filterCoefficient is to 1,
            // the more weight the previous value keeps.
            this.x = (1 - filterCoefficient) * x + filterCoefficient * this.x;
            this.y = (1 - filterCoefficient) * y + filterCoefficient * this.y;
            this.z = (1 - filterCoefficient) * z + filterCoefficient * this.z;
            this.t = t;
        } else {
            // First sample, or filter disabled: take the raw values as-is.
            this.t = t;
            this.x = x;
            this.y = y;
            this.z = z;
        }
        optionalStatsUpdate();
    }

The filterCoefficient has a dynamic value set by the user on screen. In the test I did, the filterCoefficient value was 0.87 (float) and the rotation vector got very "soft".

@okejokej

I don't know if those spikes are a matter of filtering. The theoretical values seem to be smooth without any big spikes, but the gyroscope values get huge spikes, like 10+. It's as if the emulated gyroscope is receiving something more than just the theoretical values.

@Frazew
Owner

Frazew commented Aug 10, 2017

@okejokej That's a good idea, I'll switch to the GAME_ROTATION_VECTOR.
As for the spikes, the gyroscope already uses a low-pass filter, but it could use some improvements. I plan to use a Kalman filter anyway, which should hopefully make things much better.
I'll try to see whether some events "leak" and give incorrect values, but that seems unlikely.

Edit: The latest commit 06bfee5 should fix this. I also added a low-pass filter on all sensors, but I can't test it since I'm running Nougat (sigh).

@okejokej

Okay, I played a bit with the latest commit, and while it fixes the weird spikes, it causes the Cardboard apps that worked before (YouTube, Cardboard Camera) to no longer work. After removing the leak fix (else param.setResult(null);) it works again, but the spikes are also back.

Weeeird.
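For context, the "leak fix" being toggled here boils down to something like this inside the sensor-change hook (a rough sketch only; the condition name below is a placeholder, not the module's actual code):

    @Override
    protected void beforeHookedMethod(MethodHookParam param) throws Throwable {
        if (isOurVirtualSensorEvent(param)) { // placeholder check, not the real condition
            // let the synthesised gyroscope event through unchanged
        } else {
            param.setResult(null); // suppress the original event so it doesn't "leak" to the app
        }
    }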

@okejokej

okejokej commented Aug 12, 2017

Unfortunately I'm not of much use as a tester, as for example Xposed doesn't output any logs on my phone :/ Perhaps for my own usage I will replace the Xposed logging with logcat logging and see what's going on :)

Edit: or not, that doesn't seem to work either. I guess Xposed is just not for my Chinese phone after all.

@Frazew
Owner

Frazew commented Aug 12, 2017

@okejokej I just had an idea for handling the "leaks". Could you try using the source as of the latest commit (d458497) ?
As for the logs, no problem. I think it's weird that your ROM doesn't log things, but I guess that happens (?). It's a weird issue anyway, so I don't even think there would be any interesting logs.

@okejokej

okejokej commented Aug 12, 2017

Alright! Good news. With your newest commit the leak seems fixed; the spikes are gone without the whole thing breaking.
The Cardboard app itself still doesn't seem to see the gyro, but YouTube and Cardboard Camera work! Right now YouTube 360 videos are 100% usable in mono mode with VirtualSensor, but what fun is it if it doesn't work correctly with a VR headset, right?

It reacts to movement too, but it seems that instead of rotating "you" - the camera inside the panorama - it's moving both "you" and the "viewport" itself.

https://i.imgur.com/8XMVmUn.png

Same happens in the Cardboard Camera app.

I did a little experimenting and commented out the whole Eye section in the Cardboard hook, leaving only the headTransform part. As a result, only the "mono" rotation in YouTube 360 videos worked; headset mode didn't react to any movement at all.

Then I did the opposite, leaving only the Eye section. It behaved the same as with both Eye and headTransform enabled.

So it seems headTransform is used for mono display mode only, whereas Eye can be used for both mono- and stereoscopic display modes.

edit: https://developers.google.com/vr/android/reference/com/google/vr/sdk/base/Eye.Type
I don't know if I understand it correctly, but perhaps right now it's working in monocular mode, whereas maybe it should be done separately for Left and Right eye instead of headTransform and just "Eye" in general?

edit2: Okay, I think I got it wrong about the Left and Right eye - I'm not exactly familiar with how the whole Xposed thing works. It seems, however, that the Eye part of the Cardboard hook also meddles with the perspective matrix, which appears to be a 4x4 matrix as well.

@Frazew
Owner

Frazew commented Aug 13, 2017

That's very good news !
Yeah, the Eye hook was pretty much experimental, as I didn't know what the matrix represented. It does indeed seem to handle perspective, but I don't think I know how to do that. I might just remove that hook since it doesn't work.
Anyway, thank you for your extensive testing and useful insights !

@okejokej

okejokej commented Aug 13, 2017

Too bad the Eye hook is required for stereoscopic mode :P But yeah, it does seem to be a bit tricky - on the other hand, it still reacts to movement, so it's not entirely a lost cause.

com.google.vr.sdk.base.Eye has two 4x4 matrices: eyeView and perspective. The hook does indeed only write values to eyeView, but I noticed that if I force it to write to perspective, it also reacts to movement, although differently.

com.google.vrtoolkit.cardboard seems to be the old, deprecated SDK, while com.google.vr.sdk.base is the current one.

I wonder how the Cardboard app does its thing, as it doesn't call either the headTransform or Eye constructors at all.

Edit: oops, accidentally forked the project xD
PS. If you need any logs from me in the future, please use Log.e instead of XposedBridge logging :)

@okejokej

I've spent the whole day meddling with the code and testing - and here's the newest development:
I made a serious mistake during my previous testing! YouTube picks up the emulated gyroscope in monocular mode all by itself - without the Cardboard hook enabled at all. Therefore it seems that the headTransform hook has no effect whatsoever.

The Eye hook, on the other hand, is actually messing up stuff that is otherwise already calculated correctly based on the chosen headset parameters, so all that touching the Eye fields does is move the picture and its perspective around inside the viewport, without actually moving the observer's camera - touching Eye is pointless.

Therefore, if YouTube 360 still isn't(?) working for you in mono mode, there would appear to be something wrong with the API 23-specific code.

PS. it would be cool to have some more direct form of communication in case we happen to play with this code at the same time :)

@renantopac could you perhaps specify what exactly you mean by "working"?

@okejokej

@Frazew I wonder, do you have an idea or perhaps a hypothesis on how Cardboard sdk is accessing the sensors?

@Frazew
Owner

Frazew commented Aug 14, 2017

@okejokej Yeah, most likely it's using the NDK, which unfortunately cannot be hooked at all (it's native code). That was the case for Pokémon GO, for instance; I tried poking around, but ultimately it was all handled by native code.
I'll try to verify whether this hypothesis is true, in which case this module won't be able to work with the Cardboard app.

If you want a quite detailed sense of how sensors work in Android, take a look at this and also that.
Note: I thought about trying to move the code over to a Magisk module. Maybe it is possible, but I haven't considered the options here.

As for testing, I can't test this module because I'm now running Nougat. I've tested it once after refactoring most of the code (which led to commit 2cb5f57) and that required going through the process of backing everything up and restoring a previous Marshmallow backup, which takes a while and really is a pain.

I initially also thought that a more direct form of communication would be nice, but that would mean this whole debugging process would no longer be publicly documented, and I think it could always be useful for someone. But if you'd prefer a direct form of communication, you can always tell me which one you'd like to use.

@okejokej

okejokej commented Aug 14, 2017

Too bad sensor usage is so inconsistent across VR apps. I guess this is a dead end for me, then - pretty much all the apps that interest me, specifically in VR mode, seem to access the sensors natively.

Perhaps it's possible to hook specific apps at the point where they use the obtained values in their own code (provided it's not all done in native code as well), but even with a decompiler, the information needed to do that is pretty much unobtainium.

Perhaps next time I will just buy the cheapest phone with the full set of required sensors 🤕

@okejokej

I wonder - seeing that in non-VR mode apps like YouTube access the gyroscope through the sensor API, and that 360 videos are not 3D anyway... do you think it'd be possible through Xposed to make those apps think the longer screen dimension is half its size, then clone the display onto the other half of the screen, to make it somewhat viewable on VR headsets?

@okejokej

okejokej commented Aug 16, 2017

Alright, it seems to be done in Java and entirely doable to hook specific methods/variables in apps like YouTube and probably some others. The downside of this approach is that it would have to be updated every time a new version of an application comes out.

Edit: So far I can tell that injecting the YouTube cameraMatrix with new values does produce some movement; it works somehow even though it's a private field. The way I've tried it for now is a bit too computation-heavy, as acquiring the sensor values is too complicated, but perhaps I can get the values straight from the sensorchange class rather than through the Android sensor API.
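For reference, overwriting a private field from an Xposed hook looks roughly like this (a sketch; the rendererInstance object and the matrix contents are assumptions here - only the cameraMatrix field name comes from the test above):

    // XposedHelpers handles the reflection, so private access is not a problem.
    float[] newCameraMatrix = new float[16]; // would be filled from the virtual sensor state (assumption)
    XposedHelpers.setObjectField(rendererInstance, "cameraMatrix", newCameraMatrix);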

I will leave the whole thing for now though, as running Android Studio on my over-ten-year-old laptop is a pain in the butt, and uploading the build to the phone to test every single time, then rebooting it, is even worse. From what little research I did, it seems somewhat possible to install Xposed on an emulator, so when I get my desktop PC up and working again (PSU problems) I will play with that there.

@Frazew
Owner

Frazew commented Aug 17, 2017

I just realised I had forgotten to answer... sorry about that.
It could be possible to sort of scan for functions that seem to deal with sensors/rotation, but many apps (mostly games, actually) are built on frameworks like Unity. Those are native libraries, so there's no way to hook anything. I guess that would be the last-resort option.
I like the idea of splitting the screen in half, it's interesting. However, it's not really feasible: first and foremost, it would require either rendering everything twice or "copying" the framebuffer over to the other half, which would still be computationally expensive I guess.

Don't hesitate to fork the project if you're making significant changes ;)

@okejokej

okejokej commented Aug 18, 2017

@Frazew You're right, splitting the screen was really just a random idea anyway.

I've spent some time analyzing decompiled sources of various apps using Google VR, and I've started thinking that while the idea of hooking HeadTransform is right, the headView matrix is being injected with values at the wrong time. If I understand the current hook right, it injects them each time the virtual sensor values update. It's possible they should instead be injected at the time HeadTransform methods such as getHeadView and the others (from either com.google.vrtoolkit.cardboard or com.google.vr.sdk.base.HeadTransform - which one is used depends on the specific application, I guess) are being called.
I'm starting to think that maybe the native sensor code somehow detects when those methods are being called and updates the matrix at that point, invalidating whatever has been injected into it. The HeadTransform class even has a "@UsedByNative" annotation in its source.
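A rough sketch of what hooking at call time could look like, assuming the com.google.vr.sdk.base.HeadTransform.getHeadView(float[], int) signature and that the module keeps a current 4x4 rotation matrix around (currentRotationMatrix below is a placeholder for that):

    // Sketch: overwrite the head view matrix whenever the app asks for it,
    // instead of only when the virtual sensor produces a new value.
    XposedHelpers.findAndHookMethod("com.google.vr.sdk.base.HeadTransform", lpparam.classLoader,
            "getHeadView", float[].class, int.class, new XC_MethodHook() {
                @Override
                protected void afterHookedMethod(MethodHookParam param) throws Throwable {
                    float[] headView = (float[]) param.args[0];
                    int offset = (Integer) param.args[1];
                    // currentRotationMatrix: the module's latest 4x4 rotation (assumption)
                    System.arraycopy(currentRotationMatrix, 0, headView, offset, 16);
                }
            });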

You're right about games, it's probably all native there, but in VR apps like YouTube or Cardboard it all seems to be handled in Java.

Not sure if I can test this hypothesis; I'm not too familiar with Java - the testing I've done so far was thanks to some C++ knowledge and the syntax similarities between it and Java :P

@okejokej

okejokej commented Aug 20, 2017

Sorry for spamming. After further analysis, it's possible I've been wrong and the Eye class might be usable for rotation in VR for some apps (like YouTube); however, it must be a matter of setting more than just the eyeView matrix. That matrix contains the rotation and translation of each eye, but at the very least it seems that the perspective matrix also has to be filled in properly.
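If the perspective matrix really does need to be filled in as well, it could be built with the standard Android helper - just a sketch, and the FOV/aspect/clip-plane values below are placeholders, not whatever the Cardboard SDK actually uses:

    // Build a 4x4 perspective projection matrix with placeholder parameters.
    float[] perspective = new float[16];
    android.opengl.Matrix.perspectiveM(perspective, 0,
            90f /* fovY */, 1f /* aspect */, 0.1f /* zNear */, 100f /* zFar */);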

I think I'm getting addicted to this 🤕

As for the general Cardboard (and related apps) hook, perhaps it might be possible, by setting some flag, to force it to use the SDK sensor API rather than whatever it's doing through the NDK, which in turn would make all the attempts to hook matrices directly unnecessary. This thought comes from the fact that the most recent Google VR SDK still contains all the code for getting the gyroscope through the Sensor class, so there must be some way to force it to use that.

@imtlmzamora

Sorry for letting myself in; recently, after a fight over the weekend, I got Xposed to run and installed VirtualSensor on my Galaxy A3 (A300H). Cardboard 1.5 works but not 1.8. I am working with the version from the Xposed repo - is it the same as the latest commit? Also, for me it was a LOT shaky, so I added the mod's gyro noise filter and, after configuring it, that helped a little; now I only have horizontal shakes.

@Frazew
Owner

Frazew commented Aug 29, 2017

@okejokej Thank you very much for your help. I'll push a beta release with the changes already made. When I have time again, I'll look into the Eye hook to get it to work.
I'm not sure such a flag exists, but a generic hook might be possible - I'm not sure.

@imtlmzamora No, the version from the Xposed repo is vastly different from the one here. I'm going to build a new beta release in a few days which should hopefully work better than the one you have tested; please let me know if that's not the case.

@imtlmzamora

@Frazew thank you for your answer; if you don't mind, I'm creating a branch to test some ideas on top of your current version. By the way, your project is awesome!!

@Frazew
Owner

Frazew commented Aug 30, 2017

@imtlmzamora No problem, that's the reason this is open source, everyone is welcome to be a part of it !
