VRTracker_Old Calibration

This topic contains 0 replies, has 1 voice, and was last updated by topherfangio 2 years, 5 months ago.

  • #4046

    topherfangio
    Participant

    Howdy everyone!

    Since the store was closed, I went ahead and bought all of the parts for the old VRTracker based on the information I found via Jules’ website. I have the cameras working and updated with the proper firmware, and I’ve figured out how to get each camera to connect via WebSocket to my iPhone running a Unity game and send some super-basic position data (if I move left, the Unity player moves left, but the scale is all wrong).
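
    In case it helps, here’s roughly what each camera is pushing over the socket right now, caught with a throwaway Python listener standing in for the Unity code (the JSON field names are just my own ad-hoc format, nothing official from VRTracker):

        import asyncio
        import json

        import websockets  # pip install websockets

        # Stand-in for the Unity client: prints whatever the NodeMCU/Pixy cameras send.
        # "path" is accepted for compatibility with older versions of the websockets library.
        async def handle(ws, path=None):
            async for raw in ws:
                msg = json.loads(raw)  # e.g. {"cam": 1, "x": 143, "y": 87}
                print(f"camera {msg['cam']}: blob at pixel ({msg['x']}, {msg['y']})")

        async def main():
            # The phone plays server; each camera connects as a client.
            async with websockets.serve(handle, "0.0.0.0", 8765):
                await asyncio.Future()  # run forever

        asyncio.run(main())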

    Right now I don’t have any sort of Gateway; I was hoping I could use the phone itself. I realize that I can run OpenCV on an iPhone/Android; however, that requires a commercial license that costs $200, and that’s simply not in my budget.

    So, my questions are:

    1. Am I forced to have a gateway running OpenCV constantly, or could I use OpenCV to get the intrinsic/extrinsic matrices and then just dump those into my Unity program to convert the 2D point to a 3D point? (Rough sketch of what I’m picturing below the questions.)
    2. Either way, do you have some example code that I can use as a starting point (particularly the gateway)?
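
    To clarify question 1, the offline flow I’m imagining is roughly the sketch below: snap a bunch of checkerboard stills from one of the Pixy cams, run OpenCV’s standard calibration on a laptop, and dump the matrices to a JSON file that Unity reads at startup (board size, folder name, and the example pixel are all just placeholders on my end). My understanding is that the intrinsics only get me a ray per camera, so I’d still need to intersect rays from two cameras (or assume a known height/plane) to get an actual 3D point.

        import glob
        import json

        import cv2
        import numpy as np

        # Placeholder checkerboard: 9x6 inner corners, 25 mm squares.
        BOARD_SIZE = (9, 6)
        SQUARE_SIZE = 0.025  # meters

        # 3D coordinates of the board corners in the board's own frame.
        objp = np.zeros((BOARD_SIZE[0] * BOARD_SIZE[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:BOARD_SIZE[0], 0:BOARD_SIZE[1]].T.reshape(-1, 2) * SQUARE_SIZE

        obj_points, img_points = [], []
        for path in glob.glob("calib_images/*.png"):  # stills grabbed from one camera
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, BOARD_SIZE)
            if found:
                corners = cv2.cornerSubPix(
                    gray, corners, (11, 11), (-1, -1),
                    (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
                obj_points.append(objp)
                img_points.append(corners)

        assert obj_points, "no checkerboard detections - need more/better stills"

        # Intrinsics (K) and lens distortion for this camera.
        _, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None)

        # Dump to JSON so Unity just reads numbers at startup - no OpenCV on the phone.
        with open("camera_calibration.json", "w") as f:
            json.dump({"camera_matrix": K.tolist(),
                       "distortion": dist.ravel().tolist()}, f, indent=2)

        # Going back from 2D to 3D: one camera only gives a direction (ray), not a point.
        u, v = 160.0, 100.0  # example pixel from the tracked blob
        norm = cv2.undistortPoints(np.array([[[u, v]]], np.float32), K, dist)
        ray = np.array([norm[0, 0, 0], norm[0, 0, 1], 1.0])
        ray /= np.linalg.norm(ray)
        print("ray in camera frame:", ray)
        # Intersecting rays from two cameras (their extrinsics come from where they
        # are mounted in the room) is what would give the actual 3D position.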

    As a note, I don’t have the “tags”, gateway, or the exact same cameras that come with the new VRTracker. I’m just using the Pixy cam (from IRLock) with a NodeMCU module for the Wi-Fi, and I’ve currently built my own hodge-podge WebSocket setup so the cameras can communicate with Unity.

    Thanks so much! This is such an awesome project!
