OPEN VR CODER NEEDED !! PLEASE HELP !!

Discussion in 'VR Headsets and Sim Gaming - Virtual Reality' started by SilentChill, Mar 24, 2019.

  1. Psionic001

    Psionic001 Active Member Gold Contributor

    Joined:
    Mar 5, 2017
    Messages:
    138
    Location:
    Sydney
    Balance:
    1,002Coins
    Ratings:
    +59 / 1 / -0
    My Motion Simulator:
    Motion platform, 6DOF
    I’d imagine the IMU in the hand controller would be like any other and output position data thousands of times per second.
    I’m not a coder, but I’d imagine it would be fairly trivial to add a filter to smooth out those jitters.

    Further to the above, a dedicated USB IMU device providing accel and gyro data might be easier to install, and a developer could work out a way to integrate the data stream and subtract the position data out. If it came with a GUI, the end user could also enter the movement extents of their unique platform from XYZ zero, to fine-tune exactly the movements that need to be cancelled out.
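    A filter like that could be as simple as a one-pole low-pass. A minimal sketch in Python (illustrative only; the `alpha` value would need tuning against the real IMU's sample rate):

```python
def low_pass(samples, alpha=0.1):
    """Smooth a stream of position samples.

    alpha in (0, 1]: smaller alpha = heavier smoothing (and more lag).
    """
    filtered = []
    state = samples[0]
    for s in samples:
        state = state + alpha * (s - state)  # y += a * (x - y)
        filtered.append(state)
    return filtered

# Example: a constant position of 1.0 with jitter settles near 1.0.
noisy = [1.0 + j for j in (0.05, -0.04, 0.03, -0.05, 0.02, -0.01)]
smooth = low_pass(noisy, alpha=0.2)
```

    The same one-liner update works per axis on position or orientation data; the cost is a little lag, which is usually acceptable for cancelling vibration.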
    Last edited: Apr 23, 2019
  2. SilentChill

    SilentChill Problem Maker

    Joined:
    Jul 19, 2014
    Messages:
    2,619
    Occupation:
    Railway Maintenance
    Location:
    Morecambe, Lancashire, England
    Balance:
    20,396Coins
    Ratings:
    +3,480 / 34 / -0
    My Motion Simulator:
    DC motor, Arduino, Motion platform, 6DOF
    I thought it would be easy but so far no one has been able to do anything about it :(

    We live in hope that someone can though :)
  3. othmane

    othmane New Member

    Joined:
    Apr 23, 2019
    Messages:
    2
    Balance:
    77Coins
    Ratings:
    +0 / 0 / -0
    My Motion Simulator:
    3DOF
    Hi, which VR headset do you have?
  4. SilentChill

    SilentChill Problem Maker

    Joined:
    Jul 19, 2014
    Messages:
    2,619
    Occupation:
    Railway Maintenance
    Location:
    Morecambe, Lancashire, England
    Balance:
    20,396Coins
    Ratings:
    +3,480 / 34 / -0
    My Motion Simulator:
    DC motor, Arduino, Motion platform, 6DOF
    Got an Odyssey Plus now; I used to have a CV1.
  5. BlazinH

    BlazinH Well-Known Member

    Joined:
    Oct 19, 2013
    Messages:
    2,145
    Location:
    Oklahoma City, USA
    Balance:
    16,568Coins
    Ratings:
    +1,831 / 32 / -1
    • Like Like x 1
  6. Dirty

    Dirty Well-Known Member Gold Contributor

    Joined:
    Oct 15, 2017
    Messages:
    736
    Occupation:
    All the way up front.
    Location:
    Germany
    Balance:
    7,826Coins
    Ratings:
    +859 / 2 / -0
    I too would think that the right dose of low-pass filtering would do the trick to get rid of unwanted vibrations. However, the "lead-block-on-dried-chewing-gum" method is not a bad idea either :)
  7. Dirty

    Dirty Well-Known Member Gold Contributor

    Joined:
    Oct 15, 2017
    Messages:
    736
    Occupation:
    All the way up front.
    Location:
    Germany
    Balance:
    7,826Coins
    Ratings:
    +859 / 2 / -0
    I would too! :thumbs

    If we could get enough people to chip in, we could get a $10,000 budget together and hire a developer. Sounds like a lot of money, but let's be honest here for a moment: ALL of us have spent insane amounts on hardware already. So much that we barely dare tell our wives :) ...uh, I heard,... ...from a friend!

    If everyone on these forums who is seriously interested in VR motion compensation were to donate the equivalent of one day's salary, we could have a professional solution in 3-6 months. Especially considering that both these open-source projects...
    https://github.com/matzman666/OpenVR-InputEmulator/releases/tag/v1.3
    https://github.com/matzman666/OpenVR-AdvancedSettings
    ...already contain features that only need to be adapted for our use case.

    And these Swiss guys have a professional solution in place already. I talked to one of them at the AERO-fair in EDNY two weeks ago.

    For the next 12-18 months I will be busy with writing my software, but after that I will certainly look into the option of doing motion compensation myself. It is not THAT hard. I looked into the openVR API the other day and even though I am only a novice programmer I got a vague idea of the thing.

    So, if there is someone out there with a little C++ experience, we should be able to come up with a solution that fits the needs of motion simulation. Here's a good place to start

    Cheers, Dirty :)
    • Like Like x 2
    Last edited: May 1, 2019
  8. ykeara

    ykeara New Member

    Joined:
    Apr 7, 2019
    Messages:
    5
    Balance:
    58Coins
    Ratings:
    +6 / 0 / -0
    Well, I can't post external links, but that's the wrong Advanced Settings. I might be a bit biased though XD.

    That being said, you are correct: on paper it's actually a fairly simple problem... The problem I have been having is 100% due to vibrations... the tracking data of WMR and LH-based tracking (and presumably Rift) becomes heavily unreliable with vibration... and then you have the added complexity of reducing the cost of the system.

    TBH I probably need to just quit trying to think of a great algorithm and move on to experimental trials/data collection. The only real idea I have in that regard is smoothing based on previous updates... the problem here is that this could easily accumulate drift.
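    That trade-off can be sketched in a few lines of Python (positions simplified to 2D tuples; all names are illustrative): blending each update with the previous estimate suppresses jitter, but a heavy blend visibly lags behind a moving platform, which reads as positional error while the rig is in motion.

```python
def smooth_updates(updates, weight=0.8):
    """Blend each raw tracker update with the previous smoothed pose.

    weight = how much of the old estimate to keep (0 = no smoothing).
    """
    est = updates[0]
    out = [est]
    for u in updates[1:]:
        est = tuple(weight * e + (1 - weight) * v for e, v in zip(est, u))
        out.append(est)
    return out

# A platform ramping forward in x: the heavy filter lags the true pose.
raw = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
smoothed = smooth_updates(raw, weight=0.8)
# smoothed[-1] trails well behind raw[-1] while the motion continues
```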

    Based on the helicopter video they aren't really dealing with vibration from what I saw... I believe Input Emulator does already have that in, it just doesn't handle vibration well... In an ideal world we'd have a tracker on the platform without vibrations, which would heavily simplify the code; even then it's not a perfect solution, with the HMD possibly still experiencing the tracking inconsistency... I could "lock" the head's translation... but I think that would be fairly immersion-breaking if I just left it at 3DOF.

    tl;dr: I am not sure how to deal with tracking more or less failing with vibration at the software level.

    So it would be interesting to have, say, a Vive tracker mounted on some rubber/foam to deal with the higher-frequency vibrations... but the low frequencies that you want to keep could be difficult to handle without a fairly extensive system...

    If you have ideas, shoot them here. I don't do a great job of checking, but I do look from time to time.
    • Like Like x 1
  9. Dirty

    Dirty Well-Known Member Gold Contributor

    Joined:
    Oct 15, 2017
    Messages:
    736
    Occupation:
    All the way up front.
    Location:
    Germany
    Balance:
    7,826Coins
    Ratings:
    +859 / 2 / -0
    Hey :)

    AFAIK the "OpenVR advanced settings" has a feature that's called "floor calibration" or "floor reference" or something. It tells the runtime where the floor is, so you have a correct eye height all the time. I thought maybe this could be modified for our use case. After all, we just want to tell the runtime where our "relative floor" (our platform) is :)

    You're right. It is a simple problem on paper. A bit of a more complicated problem in reality. Nonetheless solvable!
    So far, I'm even struggling with getting the sample code from the openVR simple examples to compile. I get a couple of linker Errors :-(

    I'd first try to come up with a solution regarding the motion compensation itself (no vibrations), then apply some lowpass filtering to smoothen out the position and orientation, maybe in combination with the absorption gel Pads that @apointner suggested in another thread. I'd love to play around with it, if I could only get those OpenVR examples to work....

    There is something else that I think has the potential to be a permanent fix, at least for vibrations:
    Instead of using a single Vive tracker as platform reference and then use this trackers position and orientation to do ALL the compensation, we could use THREE trackers spread out over the platform at 120° angles. For position and orientation I'd first take the average of the three measurements. I think it is highly unlikely that all three are shaking in exactly the same way, so that should be a pretty good position and orientation already.

    Then on top of that, we could take the three positions and create the orientation by creating a plane between them. This "orientation from 3 positions" should be virtually immune to vibrations in the very first place. And then maybe filter that in with the other data (Kalman filter?)
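    For illustration, a small Python sketch of that idea, assuming each tracker reports a plain xyz position tuple: the averaged centre and a plane normal from a cross product give a pose estimate that single-tracker jitter barely touches.

```python
def platform_from_trackers(p1, p2, p3):
    """Return (centre, unit normal) of the plane through three trackers."""
    centre = tuple((a + b + c) / 3 for a, b, c in zip(p1, p2, p3))
    # Two in-plane edge vectors
    u = tuple(b - a for a, b in zip(p1, p2))
    v = tuple(c - a for a, c in zip(p1, p3))
    # Cross product u x v = plane normal
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
    return centre, tuple(c / length for c in n)

# Three trackers on a level platform -> normal points straight up.
centre, normal = platform_from_trackers((0, 0, 0), (1, 0, 0), (0, 1, 0))
```

    A real implementation would still need to fix the platform's in-plane heading (the normal alone leaves yaw free), which is where averaging the trackers' reported orientations, or a Kalman filter, would come back in.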

    All in all, a challenging problem, I admit. But not hopeless :)

    Dirty :)
    • Like Like x 1
  10. ykeara

    ykeara New Member

    Joined:
    Apr 7, 2019
    Messages:
    5
    Balance:
    58Coins
    Ratings:
    +6 / 0 / -0
    The floor fix could work for your issue, but the reset-seated-position feature should do the same, as I imagine most/all of the sims are set up in seated mode. And how OVRAS does movement is considerably different from Input Emulator, so they are not really compatible.

    And with Advanced Settings our motion gives us control of 4 axes; pitch and roll are not possible. So Input Emulator is the only way I know...

    And again, my understanding was that Input Emulator's motion cancellation worked just fine... it just has an issue with vibration? Correct me if I am wrong. (I think it also doesn't handle rotation.)

    Yes, with 3 trackers set at good known distances from each other we could get a MUCH better estimation... and while we are at it, the 2.0 LHs operate at 100 Hz and should also have less error per update.

    The problem is you are now talking about a ~$600 setup without the HMD, and it would more or less require LH tracking.

    Quick testing with what I have on hand suggests the rotational measurements are accurate enough through vibrations.
    • Like Like x 1
  11. Dirty

    Dirty Well-Known Member Gold Contributor

    Joined:
    Oct 15, 2017
    Messages:
    736
    Occupation:
    All the way up front.
    Location:
    Germany
    Balance:
    7,826Coins
    Ratings:
    +859 / 2 / -0
    Indeed! :)
    It is certainly a way more expensive setup. And with 3 trackers, 2 handheld controllers and an HMD we have six tracked objects already. I'm not sure exactly, but it might mean that LH 2.0 would be needed. Correct me if I'm wrong.

    I heard that there are still problems with the Pimax, for example. Maybe someone can clarify, because I haven't had a chance to try out my 5K BE yet. If I understood correctly, the two displays are angled towards each other, so if you apply a correction to the viewpoint, it also skews the image away from being perfectly superimposed. That is at least my guess to explain the reported problems with views being corrected off-center. Anyway, I'd love to dive in a lot deeper as soon as I have more time.

    If the VR Input Emulator works and jitter is the main problem, then I think some gentle filtering in code plus those gel pads should be a great improvement. Other materials to bed the tracker in could be Play-Doh or kinetic sand.

    Dirty :)
  12. ykeara

    ykeara New Member

    Joined:
    Apr 7, 2019
    Messages:
    5
    Balance:
    58Coins
    Ratings:
    +6 / 0 / -0
    No, SteamVR/OpenVR is currently set up to handle 64 objects regardless of 1.0 or 2.0 tracking... I think their dongles only have 40 channels though.

    I don't think Pimax displays are canted (at least the lenses); I can't speak for the headset as a whole... I should also mention that Pimax hasn't been entirely compliant with OpenVR... so it could be more problematic to make some changes.

    Input Emulator only does translation, not rotation, so it's not perfect, but it's what I would use as a starting point.
    • Like Like x 1
  13. ykeara

    ykeara New Member

    Joined:
    Apr 7, 2019
    Messages:
    5
    Balance:
    58Coins
    Ratings:
    +6 / 0 / -0
    Specifically, the best I have at the moment:

    Given a setup with n trackers,

    make 2 virtual trackers:

    1 @ the base HMD position
    1 @ the center of the tracker array

    Use `k_pch_TrackingOverride_Section` to override the HMD tracking with the base-HMD-position virtual tracker.

    Now, every frame/update: rotation of the HMD-position virtual tracker = HMD rotation - tracker array rotation...

    This should keep your head in orientation with the car without locking it in a position. The positional offsets should be applied similarly, but I need to work out the math for it, probably via quaternions.

    AND you need to adjust the rotation of the array based on the tracker locations, so I need to think out the math.
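    The per-frame rotation update above can be sketched with plain quaternion math (Python, (w, x, y, z) convention; all function names here are made up for illustration). "Subtracting" the array rotation means composing with its inverse:

```python
import math

def q_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def q_conj(q):
    """Conjugate = inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def compensate(q_hmd, q_platform):
    """Head orientation relative to the moving platform."""
    return q_mul(q_conj(q_platform), q_hmd)

# If the platform and the head rotate together (pure platform motion),
# the compensated orientation stays at identity: no false head motion.
half = math.radians(30) / 2
q = (math.cos(half), 0.0, math.sin(half), 0.0)  # 30 deg yaw
q_rel = compensate(q, q)
```

    The positional offset works the same way in principle: subtract the tracker array's translation, then rotate the remainder into the platform frame with the inverse platform quaternion.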
    • Like Like x 1
  14. Gabor Pittner

    Gabor Pittner Active Member

    Joined:
    Oct 25, 2018
    Messages:
    190
    Location:
    Szekesfehervar Hungary
    Balance:
    1,294Coins
    Ratings:
    +84 / 0 / -0
    My Motion Simulator:
    6DOF
    Heave is needed so much... how can I check if my front wing and suspension are in good condition if I can't lift my viewpoint up? ;) And sometimes it's also needed in an airplane, to see out. Some cars have safety nets at the sides of the seat; I couldn't see the side mirrors correctly, so it's needed there too... there are several examples :roll
    • Agree Agree x 1
  15. yobuddy

    yobuddy Well-Known Member Staff Member Moderator SimAxe Beta Tester SimTools Developer Gold Contributor

    Joined:
    Feb 9, 2007
    Messages:
    5,133
    Occupation:
    Computer Technician
    Location:
    Portland, Oregon - USA
    Balance:
    47,906Coins
    Ratings:
    +5,027 / 16 / -0
    Maybe do it in hardware, rather than software?
    Using the same idea as these cheap DIY headsets, couldn't we make a DIY tracker with an Arduino just for motion cancellation?
    https://hackaday.com/2014/06/13/openvr-building-an-oculus-rift-for-only-150/

    Then manipulating the sensor output from the Arduino should be fairly straightforward, I think?
    (Forgive me if I'm way off the mark, as I currently don't have a headset to play with.)
    • Creative Creative x 1
  16. dododge

    dododge Active Member Gold Contributor

    Joined:
    Mar 8, 2015
    Messages:
    100
    Balance:
    854Coins
    Ratings:
    +71 / 0 / -0
    Those cheap IMU-based designs generally only track angular motion reliably (if even that). Motion cancellation also needs to deal with position changes.

    You can supposedly estimate positional tracking with an IMU that has acceleration sensors, but I'm pretty sure they drift a lot. All of the 6DOF VR headsets use some sort of external reference for positional tracking correction: DK2/Rift have stationary external cameras; Vive/Pimax/Index use stationary laser beacons; WMR/Rift S/Quest use outward-facing cameras that build some sort of visual model of the surroundings; and there are commercial systems for warehouse-sized spaces that I think use stationary radio beacons.

    If you have the Vive "base station" laser emitters you can get a cheap TS4231 module which can handle positional tracking of a photodiode and can be attached to an Arduino -- but if you've got the base stations anyway then it's probably a lot easier to just buy a tracking puck, which has photodiodes facing multiple directions and can report full 6DOF information to an application via the usual VR APIs.

    A single Vive puck should be sufficient; it's just that the excessive vibration in driving rigs can cause issues with the tracking. I suspect one issue is that the tracking itself might miscalculate if a fast/sharp vibration causes the photodiode to move too much between the two(?) laser-sweep detection events that are needed to compute the position. Even if the tracking is working perfectly, you can end up with transducer vibrations causing the in-game camera to shake rapidly, which makes it hard to see the game. Adding some sort of damping/smoothing filter to the tracking data before it gets used for compensation would probably be enough.
    • Agree Agree x 2
    • Like Like x 1
  17. yobuddy

    yobuddy Well-Known Member Staff Member Moderator SimAxe Beta Tester SimTools Developer Gold Contributor

    Joined:
    Feb 9, 2007
    Messages:
    5,133
    Occupation:
    Computer Technician
    Location:
    Portland, Oregon - USA
    Balance:
    47,906Coins
    Ratings:
    +5,027 / 16 / -0
    I would not use an IMU at all though (well, maybe for self-tuning or something).
    I would add the motion calculation directly to the motion interface itself.

    Let's say you use an Arduino as the interface for the simulator.
    The Arduino has the "pots" attached to it, so it already knows right where the sim is at all times.
    That means we could use this same "pot" information (reformatted as needed) to be sent as motion-cancellation data.
    I don't see how it could ever be wrong, as it's what's controlling the sim's motion in the first place.
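    As a rough illustration of that idea (not real interface code; the ADC range, axis travel, and smoothing constant are all assumptions), converting a pot reading into smoothed cancellation data could look like:

```python
ADC_MAX = 1023      # 10-bit Arduino ADC
TRAVEL_DEG = 40.0   # assumed +/-20 degrees of axis travel

def pot_to_angle(raw, smooth_state, alpha=0.25):
    """Convert a raw pot reading to a smoothed axis angle in degrees.

    raw: ADC count (0..ADC_MAX), smooth_state: previous output.
    """
    angle = (raw / ADC_MAX - 0.5) * TRAVEL_DEG       # centre = 0 deg
    smooth_state += alpha * (angle - smooth_state)   # light smoothing
    return smooth_state

# A pot sitting near mid-travel maps to roughly 0 degrees.
state = 0.0
for raw in (512, 512, 512):
    state = pot_to_angle(raw, state)
```

    One value like this per axis, streamed out alongside the normal motion data, is all the compensation side would need, and since it comes from the same feedback that drives the sim, it cannot disagree with the platform.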

    I'd have to think it over more for a more complete idea.
    But I don't see why it could not be added to the sim's interface itself, with smoothing or a deadzone etc...
    Take care,
    yobuddy
    • Like Like x 2
  18. Dirty

    Dirty Well-Known Member Gold Contributor

    Joined:
    Oct 15, 2017
    Messages:
    736
    Occupation:
    All the way up front.
    Location:
    Germany
    Balance:
    7,826Coins
    Ratings:
    +859 / 2 / -0
    IMUs like this one from Adafruit (together with BOSCH) have come a long way. Look at this:
    Bildschirmfoto 2019-05-18 um 10.43.10.png
    ...it has integrated sensor fusion :) which is a huge deal for DIYers like ourselves. But indeed, it does NOT provide accurate position natively, only orientation and accelerations. If someone wants to turn that data into position information through integration (which is possible), I'd suggest you learn how to juggle Kalman filters, and I predict it is gonna be a long and stony path.

    Does the Arduino REALLY know the orientation and position? AFAIK, it only knows the pots and thereby the length of every leg.

    If you want to derive the actual platform pose (position & orientation) from those, you will have to apply forward kinematics. Now, the thing with forward kinematics is that you will get more than one solution. I think I read somewhere that there can be up to 24 possible solutions to the equations of motion.
    Meaning: there is more than one pose that fulfils the condition of the legs being a certain length. Not impossible, but computationally challenging.
    Probably the most obvious example:
    Bildschirmfoto 2019-03-29 um 14.43.53.png
    The Arduino does not even know if the platform is above or below the ground. Totally obvious for humans, but computationally you will have to find a way to resolve this ambiguity. There are also 6 ways to orient the platform 90° off in the vertical :) Not impossible, just a long and stony path.
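    That above/below ambiguity is easy to show numerically. A toy Python sketch (a simplified level 3-leg platform with made-up geometry, not a real rig): a pose mirrored below the base plane produces exactly the same leg lengths, so leg lengths alone cannot tell the two apart.

```python
import math

# Base anchor points (x, y, z) and platform anchor points (x, y),
# both invented for illustration.
base = [(1.0, 0.0, 0.0), (-0.5, 0.87, 0.0), (-0.5, -0.87, 0.0)]
top = [(0.8, 0.0), (-0.4, 0.7), (-0.4, -0.7)]

def leg_lengths(height):
    """Leg lengths with the platform held level at the given z height."""
    return [math.dist(b, (tx, ty, height))
            for b, (tx, ty) in zip(base, top)]

up = leg_lengths(+1.0)
down = leg_lengths(-1.0)  # mirrored pose below the base plane
# up == down: the legs cannot distinguish the two poses
```

    A forward-kinematics solver has to break ties like this with extra constraints (e.g. "the platform is always above the base"), which is part of why the pose-from-leg-lengths route is such a long and stony path.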

    There is another (similarly stony) path to the actual pose: the motion-cueing software knows at least the commanded pose. If we could model the system behaviour of the platform with sufficient precision, we could approximate the actual pose. Maybe that is what these guys from HeadWay call "advanced mathematic formulas"?
    Bildschirmfoto 2019-05-18 um 11.37.46.png
    This would require a substantial amount of expertise with mechanical systems and their modelling. Have I mentioned the term "long and stony path" in this post already? Again, not impossible, but a lot of work. Most of all, a lot of work that has to be done for every individual sim.

    As I posted to @yobuddy in a PM earlier today, I see four major options to get motion compensation:
    1. OpenVR Input Emulator. Maybe developing it further.
    2. Ask game developers for native support of motion compensation. I talked to the X-Plane guys and they were pretty positive about the idea.
    3. Ask HMD manufacturers for native support of motion compensation (Rift, Valve, Pimax). Might be difficult, because we are such a small crowd and they're concerned with anti-cheat.
    4. Measure the platform externally and use existing game APIs to correct the viewpoint. Possible in X-Plane, for example. Lots of math, so only my last resort.
    ...the truly tragic thing about this is that game or hardware devs could suuuuper easily support this feature natively. My personal favorite is to keep developing the Input Emulator further,...

    Dirty :)
    • Like Like x 2
    • Winner Winner x 1
  19. noorbeast

    noorbeast VR Tassie Devil Staff Member Moderator Race Director

    Joined:
    Jul 13, 2014
    Messages:
    20,460
    Occupation:
    Innovative tech specialist for NGOs
    Location:
    St Helens, Tasmania, Australia
    Balance:
    144,596Coins
    Ratings:
    +10,741 / 52 / -2
    My Motion Simulator:
    3DOF, DC motor, JRK
    I am with @Dirty on this one, and have previously suggested that, as the code for OpenVR Input Emulator is available via GitHub, it can either be incorporated into SimTools or further developed. In my view that is the best way forward for motion cancellation at this particular point in time, given the fractured nature of VR HMD tracking solutions. The biggest challenge is building in some smoothing to account for transducers.

    At worst it favours SteamVR HMDs, but in some ways that is not such a bad thing, as both the Pimax (with user fiddling) and the Valve Index are arguably the best options for motion sims. The resolution of the Reverb makes it a strong contender too, but its inside-out tracking somewhat complicates the issue for large-axis-movement sims; at least it also uses SteamVR.
    • Like Like x 2
  20. apointner

    apointner Siddhartha

    Joined:
    Aug 16, 2014
    Messages:
    63
    Location:
    N 48° 9'0.88" E 12° 5'45.84"
    Balance:
    764Coins
    Ratings:
    +17 / 3 / -0
    My Motion Simulator:
    3DOF, DC motor, Arduino, Motion platform
    Yeah, I'm with the OpenVR Input Emulator also. This thing works perfectly with just one sensor, without any lag or misbehaviour, and with 360° 6DOF. Even if tracking is lost and regained, it works. The vibration problem is not a big deal for a coder. But adapting it to every Steam HMD will be a never-ending challenge, because we need the interfaces to e.g. PiTool. Every (Steam, HMD) update could break everything. And then the best coder can do nothing if Vive, Pimax and Oculus don't support him. So: does anyone here have really good relations with Steam core coders? :D:D
    • Like Like x 2