What's New in Unity ARKit Plugin for ARKit 2

Apple announced exciting news for AR developers last week at WWDC, including ARKit 2. Unity has worked closely with Apple to allow our developers instant access to all of these new features with an update to Unity ARKit Plugin. In this blog, we’ll get into the technical details of ARKit 2’s new features and how to access them via the Unity ARKit Plugin. Please download the plugin from Bitbucket to follow along.

ARWorldMap

ARWorldMap is a useful new feature in ARKit 2 that enables both persistent AR experiences and shared multiplayer AR experiences. (Read more on ARWorldMap here.)

See example in plugin: Examples/ARKit2.0/UnityARWorldMap/UnityARWorldMap.unity

Every session builds up an ARWorldMap as you move around and detect more feature points. You can get the current ARWorldMap from a session in C# and save it somewhere under Application.persistentDataPath.

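For instance, here is a minimal sketch of grabbing and saving the map, with API names following the plugin's UnityARWorldMap example (the file name is illustrative):

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.XR.iOS;

public class WorldMapSaver : MonoBehaviour
{
    public void SaveWorldMap()
    {
        // Ask ARKit for the world map built up by the current session.
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .GetCurrentWorldMapAsync(OnWorldMap);
    }

    void OnWorldMap(ARWorldMap worldMap)
    {
        // The callback can fire with null if no map is available yet.
        if (worldMap != null)
        {
            // "myFirstWorldMap.worldmap" is an illustrative file name.
            worldMap.Save(Path.Combine(Application.persistentDataPath, "myFirstWorldMap.worldmap"));
        }
    }
}
```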

You may also load a saved ARWorldMap from where you saved it. This allows virtual objects to persist in the same coordinate space even if you leave a session and come back to it later.

ARWorldMap can be serialized to a byte array and sent across to another device using WiFi, Bluetooth or some other means of sharing. It can also be deserialized on the other side and used to relocalize the other device to the same world mapping as the first device, so that you can have a shared multiplayer experience.

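A sketch of that round trip, assuming the serialization helpers from the plugin's SharedSpheres example (the transport layer is whatever you choose):

```csharp
// Sender: flatten the current map into bytes for your transport of choice.
byte[] worldMapBytes = worldMap.SerializeToByteArray();
// ... send worldMapBytes over UNet, WiFi, Bluetooth, etc. ...

// Receiver: rebuild an ARWorldMap from the received bytes,
// then use it to relocalize as shown in the next snippet.
ARWorldMap receivedMap = ARWorldMap.SerializeFromByteArray(worldMapBytes);
```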

Once you have an ARWorldMap, whether you loaded it, still have it in memory, or received it from another device, your device can share coordinate systems with that ARWorldMap by setting it as a parameter in the session configuration and resetting the ARSession with that configuration.

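A sketch of that reset, assuming `worldMap` holds a map you loaded (e.g. via ARWorldMap.Load) or received; the configuration field and run options follow the plugin's UnityARWorldMap example:

```csharp
// Build a world-tracking configuration that carries the shared map.
ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
config.planeDetection = UnityARPlaneDetection.Horizontal;
config.alignment = UnityARAlignment.UnityARAlignmentGravity;
config.worldMap = worldMap;

// Reset tracking and drop old anchors so the session relocalizes against the map.
UnityARSessionRunOption runOption =
    UnityARSessionRunOption.ARSessionRunOptionResetTracking |
    UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors;
UnityARSessionNativeInterface.GetARSessionNativeInterface()
    .RunWithConfigAndOptions(config, runOption);
```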

This resets the session, and as you move around, it tries to match up the feature points in the ARWorldMap to the feature points it's detecting in your environment. When they match, it relocalizes your device coordinates to align with the coordinates that were saved in the ARWorldMap.

Here is a video of a more complete example called SharedSpheres which uses Unity’s Multiplayer Networking feature UNet to transmit the ARWorldMap to another device to sync up their coordinate systems:

(Video demo)

ARReferenceObject and ARObjectAnchor

Similar to the ARReferenceImage and ARImageAnchor introduced for image recognition in ARKit 1.5, we now have ARReferenceObject and ARObjectAnchor for object recognition. This was the feature prominently shown off by Lego in the WWDC keynote to recognize their real playset and enhance it with a virtual playset overlaid on top of the real one.

UnityARObjectAnchor example

How to use it in Unity: See the Examples/ARKit2.0/UnityARObjectAnchor/UnityARObjectAnchor.unity scene in the plugin.

This example assumes that you already have .arobject files describing the objects you want to recognize. You can create .arobject files either with the UnityObjectScanner example described below or with Apple's ARKit Object Scanner app; both produce the same file format for use in this workflow.

Again, very similar to ARReferenceImage, we set up an ARReferenceObjectsSetAsset, which contains references to ARReferenceObjectAssets. Then we add a reference to that ARReferenceObjectsSetAsset to the ARSession configuration so that the session tries to detect the ARReferenceObjects in that set.

All this can be done in the Unity Editor.

Whenever an object is recognized, an ARObjectAnchor is created, and just like for other anchors, you can subscribe to an event that tells you when these anchors are added, updated or removed.

When that event is triggered, you can decide what you want to do (e.g., instantiate a prefab at that location), as sketched below.

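A minimal sketch of such a subscriber; note the object-anchor event names here are assumed by analogy with the plugin's image-anchor events, so check UnityARSessionNativeInterface for the exact names:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class ObjectAnchorSpawner : MonoBehaviour
{
    public GameObject prefabToSpawn; // spawned wherever an object is recognized

    void OnEnable()
    {
        // Assumed event name, by analogy with ARImageAnchorAddedEvent.
        UnityARSessionNativeInterface.ARObjectAnchorAddedEvent += OnObjectAnchorAdded;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARObjectAnchorAddedEvent -= OnObjectAnchorAdded;
    }

    void OnObjectAnchorAdded(ARObjectAnchor anchor)
    {
        // Convert the anchor pose from ARKit's coordinate space to Unity's.
        Vector3 position = UnityARMatrixOps.GetPosition(anchor.transform);
        Quaternion rotation = UnityARMatrixOps.GetRotation(anchor.transform);
        Instantiate(prefabToSpawn, position, rotation);
    }
}
```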

(Video demo)

UnityObjectScanner example

Examples/ARKit2.0/UnityARObjectScanner/UnityARObjectScanner.unity is a more complete example that handles both creating objects, using a pickable bounding box, and detecting them.

You can save the .arobject files you have scanned using this example and then use iTunes File Sharing to transfer them to your Mac. Once the files are on the Mac, you can rename them before putting them into your Unity project.

This example has two modes: scanning and detecting. For the scanning mode, we use an ARKitObjectScanningSessionConfiguration, which does a more detailed exploration of the scene but uses more CPU and power (so its use should be limited).

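A hedged sketch of switching into scanning mode; the struct name comes from the paragraph above, while the plane-detection field and the RunWithConfig overload are assumptions modeled on the plugin's world-tracking configuration:

```csharp
// Scanning mode: more detailed scene exploration, at a higher CPU/power cost.
ARKitObjectScanningSessionConfiguration scanConfig = new ARKitObjectScanningSessionConfiguration();
scanConfig.planeDetection = UnityARPlaneDetection.Horizontal; // assumed field

// Assumed overload accepting the scanning configuration.
UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(scanConfig);
```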

Using this configuration, you can tap on a plane near the object that you want to scan to produce a red bounding box covering it. Manipulate the box so that it just covers the object of interest, then scan all around the box to capture as many feature points on the object as possible. Finally, tap a button to create an ARReferenceObject; the created object is saved to a list.

(Video demo)

Pressing the Detect button takes you to the detecting mode, which works like the object anchor example above but uses a method that dynamically adds ARReferenceObjects to the set of objects to detect.

Pressing the Save button saves all the objects scanned so far, as .arobject files, to a folder on the device; from there, transfer and rename them as described above.

AREnvironmentProbeAnchor

AREnvironmentProbeAnchor is a new kind of anchor that can either be generated automatically or created where you specify. This anchor creates and updates a reflected environment map of the area around it based on the ARKit video frames and world-tracking data. It also uses a machine learning algorithm to approximate the environment texture for parts of the scene it has not seen yet, based on an ML model trained on thousands of environments.

How to use it in Unity: See the Examples/ARKit2.0/UnityAREnvironmentTexture folder for examples.

There is a new parameter on the session configuration that controls this feature, which can take one of three values: UnityAREnvironmentTexturingNone, UnityAREnvironmentTexturingManual or UnityAREnvironmentTexturingAutomatic.

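For example, a sketch of opting into automatic environment texturing; the three values are from the paragraph above, while the enum type and field names are assumptions modeled on the plugin's other configuration fields:

```csharp
ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
config.planeDetection = UnityARPlaneDetection.Horizontal;
// Let ARKit create and update AREnvironmentProbeAnchors on its own.
config.environmentTexturing = UnityAREnvironmentTexturing.UnityAREnvironmentTexturingAutomatic;
UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
```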

With the UnityAREnvironmentTexturingManual mode, you will have to create an AREnvironmentProbeAnchor yourself, but ARKit will update the texture that is captured from the environment.

If you use UnityAREnvironmentTexturingAutomatic mode instead, ARKit will generate AREnvironmentProbeAnchors at procedurally spaced intervals according to the data it infers from your session and your movement through the space.

Both of the examples generate a prefab that contains a Unity ReflectionProbe component and update it with the environment texture from the AREnvironmentProbeAnchor. This ReflectionProbe now participates in the standard Unity rendering pipeline and will enhance any GameObject that uses it.

(Video demo)

Image tracking

Reference images work the same as in ARKit 1.5, but now, instead of just recognizing images, ARKit allows you to track them: when a reference image moves, the ARImageAnchor associated with it moves with it, so you can move content anchored on those moving images.

There is one extra parameter on the session configuration that enables this: you state how many images you want to track simultaneously during the session. The existing example has been updated to use this new feature, as sketched below.

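A sketch of what that looks like; the field name is an assumption mirroring ARKit's native maximumNumberOfTrackedImages:

```csharp
ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
// Track up to two of the recognized reference images at once (assumed field name).
config.maximumNumberOfTrackedImages = 2;
UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
```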

(Video demo)

Face Tracking improvements

ARKit 2 also improves face tracking on iPhone X with some new features. First, there is one extra blendshape coefficient, TongueOut, which returns a value between 0.0 and 1.0 depending on how far ARKit 2 face tracking perceives you have stuck your tongue out. Apple showed this at WWDC on their Animojis, and it appeared to be very popular with the audience.

The other improvement is eye gaze tracking: you receive a transform describing where each eye on the face is pointing, as well as the position of the object being looked at.

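A minimal sketch of reading both signals from the face-anchor events; the "tongueOut" blendshape key mirrors ARKit's ARBlendShapeLocationTongueOut, and the lookAtPoint member is an assumption, so check the UnityTongueAndEyes scene below for the exact names:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class TongueAndEyesReader : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += OnFaceUpdated;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= OnFaceUpdated;
    }

    void OnFaceUpdated(ARFaceAnchor anchor)
    {
        // 0.0 = tongue in, 1.0 = tongue fully out.
        float tongueOut;
        if (anchor.blendShapes.TryGetValue("tongueOut", out tongueOut))
            Debug.Log("TongueOut: " + tongueOut);

        // Eye gaze: the point the eyes converge on (member name assumed).
        Debug.Log("Looking at: " + anchor.lookAtPoint);
    }
}
```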

Take a look at Examples/ARKit2.0/UnityTongueAndEyes/UnityTongueAndEyes.unity for an example of how to make use of this new data coming in from the face anchors.

We’ll leave you with this creepy image of the author demonstrating this example:

https://www.youtube.com/edit?o=U&video_id=ZBw_f_my5-M

Make some great AR apps!

For more technical details on the features above and how to use them from Unity, take a look at What’s New In Unity ARKit Plugin for ARKit 2. Please download the latest version of the plugin from Bitbucket and try building the examples to your iOS devices. Come to the forums with your questions. Then take the next step and create your own amazing AR experiences with the easy to use tools that we have provided. Most of all, have fun doing it!

Translated from: https://blogs.unity3d.com/2018/06/14/whats-new-in-unity-arkit-plugin-for-arkit-2/
