google-project-tango

Camera-Offset | Project Tango

Submitted by 老子叫甜甜 on 2019-12-13 15:33:18
Question: I am developing an augmented reality app for Project Tango using Unity3d. Since I want virtual objects to interact with the real world, I use the Meshing with Physics scene from the examples as my basis and placed the Tango AR Camera prefab inside the Tango Delta Camera (at the relative position (0,0,0)). I found out that I have to rotate the AR Camera up by about 17 degrees so that the dynamic mesh matches the room; however, there is still a significant offset to the live preview from the
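Applying a fixed pitch correction like the 17-degree rotation described above amounts to composing a corrective quaternion with the camera's orientation. The following is a minimal pure-Python sketch of that composition (the function names and the identity starting pose are illustrative, not part of the Tango SDK):

```python
import math

def quat_mul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def pitch_correction(deg):
    # Rotation about the camera's local X axis by `deg` degrees
    half = math.radians(deg) / 2.0
    return (math.cos(half), math.sin(half), 0.0, 0.0)

# Apply a +17 degree pitch fix to an identity camera orientation
corrected = quat_mul(pitch_correction(17.0), (1.0, 0.0, 0.0, 0.0))
```

In Unity one would more likely multiply `Quaternion.Euler(17f, 0f, 0f)` onto the camera transform; the sketch just makes the underlying math explicit.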

Drift Correction update of Project Tango after Google IO

Submitted by 谁都会走 on 2019-12-13 06:13:15
Question: I am looking for the drift correction update for the Project Tango APIs after the presentation at Google I/O 2016. You can find the video at this link; the drift correction update is presented at about 22:00 min. I hoped this function would be available after the big Okul update on June 9th, but I can't find it in any API. Does anyone know when this function will be available? The screenshot below shows what I'm looking for. The KEY_BOOLEAN_ENABLE_DRIFT_CORRECTION isn't available in any of the

Saving Tango camera data as an image

Submitted by 一世执手 on 2019-12-13 03:54:28
Question: I'd like to save the camera data from the Tango camera as an image file. I'm not sure where to start; the closest question I could find is this: Getting Tango's camera stream data. Other questions and answers look like they are out of date. Is this applicable to me? Or can I just get the texture from ITangoCameraTexture and save that as an image file? Also, is there a way to set the frame rate of the Tango camera? Answer 1: Your script should inherit ITangoVideoOverlay and implement
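Once a raw frame buffer is in hand (however it is obtained from the camera callback), writing it out in a dependency-free format is straightforward. As an illustrative sketch, assuming you have extracted the luminance (Y) plane of a frame, the bytes can be packed into a binary PGM file that any image viewer opens; in Unity itself, `Texture2D.EncodeToPNG` would be the more direct route:

```python
def y_plane_to_pgm(y_bytes, width, height):
    # Pack a raw luminance (Y) plane into a binary PGM (P5) image:
    # a text header followed by one unsigned byte per pixel.
    header = f"P5\n{width} {height}\n255\n".encode("ascii")
    return header + bytes(y_bytes[:width * height])

# 2x2 test frame: a black/white checker pattern
pgm = y_plane_to_pgm(bytes([0, 255, 255, 0]), 2, 2)
```

The returned bytes can be written with `open("frame.pgm", "wb").write(pgm)`.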

Fail to improve pose of point cloud with ADF origin

Submitted by 若如初见. on 2019-12-13 02:11:45
Question: I save the point clouds of a scene and their quaternions in a PCL file. First, I only used the pose w.r.t. device start (see second image) to get the quaternion. I discovered a drifting problem, which I mentioned here. Therefore, I learned the scene with area learning (see first image) by walking around the table. After that, I load the area description file (ADF) to overcome the drifting. I wait for the first loop closure/localization in the onPoseAvailable callback. Then in the
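Regardless of which base frame the pose is expressed in (device start or ADF), applying it to a saved point cloud is the same operation: rotate each point by the pose quaternion and add the translation. A minimal sketch, with quaternions as (w, x, y, z) tuples (the function names are illustrative):

```python
import math

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q = (w, x, y, z): v' = q v q*
    w, x, y, z = q
    vx, vy, vz = v
    # expanded quaternion-vector product (avoids building a matrix)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def transform_cloud(points, q, t):
    # Apply a pose (rotation q, translation t) to every point
    return [tuple(a + b for a, b in zip(quat_rotate(q, p), t))
            for p in points]

# 90-degree rotation about Z plus a 2 m lift along Z
pts = transform_cloud([(1.0, 0.0, 0.0)],
                      (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4)),
                      (0.0, 0.0, 2.0))
```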

How do I get more reliable Y position tracking for the Google Tango in Unity?

Submitted by 此生再无相见时 on 2019-12-13 00:55:13
Question: We have a Unity scene that uses area learning, which has been extremely reliable and consistent about the XZ position. However, we are noticing that sometimes the Tango Delta Camera's Y position will "jump up" very high in the scene. When we force the Tango to relocalize (by covering the sensors for a few seconds), the Y position remains very off. At other times, the Y position varies by 0.5–1.5 Unity units when we first start up our Unity app on the Tango and are holding it in the exact same
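This does not fix the underlying localization issue, but a common application-side mitigation for single-frame height spikes is to hold the last accepted Y value when the per-frame delta exceeds a plausibility threshold. A hypothetical sketch (the threshold and function name are assumptions, not Tango API):

```python
def filter_y_jumps(y_samples, max_step=0.3):
    # Reject single-frame Y jumps larger than max_step (in Unity
    # units) by holding the previously accepted height. This masks
    # transient spikes only; persistent offsets need relocalization.
    accepted = []
    for y in y_samples:
        if accepted and abs(y - accepted[-1]) > max_step:
            y = accepted[-1]  # hold previous height
        accepted.append(y)
    return accepted

smoothed = filter_y_jumps([0.0, 0.05, 5.0, 0.1])
```

Note the trade-off: a genuine fast vertical motion (riding an elevator, say) would also be clamped.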

How to detect floor and other surfaces in Google Tango?

Submitted by 拟墨画扇 on 2019-12-13 00:06:49
Question: I'm new to Tango and I want to scan a room to detect walls and color them red, and detect floors and color them blue. I reviewed the Tango tutorial where you can place a cat. It looks like there's a FindPlane function that takes a touch position. Is this something I can use to distinguish walls from floors? Answer 1: Did you find the Floor Finding Example? Also, the Java API since Caporales has better 2D floor plan extraction. I've not seen anything existing about detecting walls, I'm afraid. Once you've

How to detect an absolute rotation of an android device

Submitted by 强颜欢笑 on 2019-12-12 12:49:01
Question: I'm looking to implement a robot that navigates indoors, using software running on an Android device. One mandatory feature is to know the robot's orientation in "real time". I have one major constraint: the Android device must be placed with its screen on the gravity axis (that is, vertical, as when taking pictures with the device's camera). This prevents me from using azimuth, which is the most common measure for getting a reference angle. It makes no sense to use: SensorManager
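When the magnetometer-based azimuth is unusable, one alternative on Android is the game rotation vector sensor (TYPE_GAME_ROTATION_VECTOR), which fuses gyroscope and accelerometer without the magnetometer; its output is a quaternion from which the rotation about the vertical can be extracted. A sketch of that yaw extraction, assuming a (w, x, y, z) quaternion with Z as the world vertical (conventions vary, so verify against your sensor's frame):

```python
import math

def yaw_about_gravity(q):
    # Extract the rotation about the world vertical (yaw, in degrees)
    # from a unit quaternion (w, x, y, z), standard ZYX Euler order.
    w, x, y, z = q
    return math.degrees(math.atan2(2.0 * (w * z + x * y),
                                   1.0 - 2.0 * (y * y + z * z)))

# quaternion for a 90-degree turn about the vertical axis
heading = yaw_about_gravity((math.cos(math.pi / 4), 0.0, 0.0,
                             math.sin(math.pi / 4)))
```

Note the heading is only relative to the starting orientation (the game rotation vector has no magnetic north reference) and will drift slowly with gyro bias.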

Intermittent loss of pose data in Leibniz release

Submitted by 旧时模样 on 2019-12-12 09:45:59
Question: I just updated my device to the latest (Leibniz) release and here are some observations/problems: 1) There are now prolonged (2-3 s) intermittent periods in my app where the pose data is invalid. I assume the problem is in the driver, because the issue also occurs in the Tango Explorer. Just starting the Explorer and letting it sit there results in the "Motion Tracking Lost" dialog popping in and out. Can anyone confirm this? 2) The color buffer in the TangoService_connectOnFrameAvailable()
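To quantify intermittent tracking loss like this, it helps to log the timestamps of valid poses and scan for gaps longer than the expected frame interval. An illustrative sketch (the 0.5 s threshold is an assumption; Tango normally delivers poses at 100 Hz):

```python
def find_tracking_gaps(timestamps, max_gap=0.5):
    # Return (start, end) pairs where consecutive valid-pose
    # timestamps (seconds) are more than max_gap apart, i.e. spans
    # of likely tracking loss.
    gaps = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > max_gap:
            gaps.append((prev, cur))
    return gaps

gaps = find_tracking_gaps([0.0, 0.033, 0.066, 3.0, 3.033])
```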

Marker based initial positioning with ARCore/ARKit?

Submitted by 最后都变了- on 2019-12-12 08:10:34
Question: Problem situation: creating AR visualizations always at the same place (on a table) in a comfortable way. We don't want the customer to place the objects themselves, as in countless ARCore/ARKit examples. I'm wondering if there is a way to implement these steps: detect a marker on the table; use the position of the marker as the initial position of the AR visualization and go on with SLAM tracking. I know there is something like a marker-detection API included in the latest build of the
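The second step, anchoring content at the detected marker, is a transform chain: once the marker's pose in world coordinates is known, the content's world pose is the product world_from_marker x marker_from_content. A minimal sketch with row-major 4x4 matrices (the helper names are illustrative; in practice the frameworks hand you these matrices):

```python
def mat_mul(a, b):
    # 4x4 row-major matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    # Homogeneous translation matrix
    return [[1.0, 0.0, 0.0, x],
            [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

def place_content(world_from_marker, marker_from_content):
    # Chain transforms: world <- marker <- content gives the
    # content's world pose once the marker has been detected.
    return mat_mul(world_from_marker, marker_from_content)

# marker sits 1 m along X in the world; content 1 m along Y from marker
pose = place_content(translation(1.0, 0.0, 0.0), translation(0.0, 1.0, 0.0))
```

After this one-time anchoring step, SLAM tracking keeps the world frame stable, so the marker does not need to stay in view.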

Project Tango Unity tutorials fail

Submitted by 痞子三分冷 on 2019-12-12 06:19:11
Question: I'm trying to walk through these two Project Tango Unity tutorials: https://developers.google.com/project-tango/apis/unity/unity-depth-perception https://developers.google.com/project-tango/apis/unity/unity-motion-tracking In each of them I get stopped in my tracks by what appears to be a lack of a Unity plugin (I assume from the Tango SDK .unitypackage). I followed the instructions to import the downloaded Tango SDK .unitypackage, but for some reason I don't have the option to add a PointCloud