(From the official documentation: https://developer.vuforia.com/resources/dev-guide/step-2-compiling-simple-project)
When creating the Unity project, avoid using spaces in the name if targeting iOS because this causes problems later when you build with Xcode. Note that by double-clicking on a package, you can selectively import parts of that package.
Note: If you do not see the Vuforia package in the list, go back to Step 1 and manually install the packages.
Next, you need to add a Device Database to your project. You can do this in two ways:
If you are copying the Device Database files from another project, be sure to copy any files located in the Editor/QCAR/ImageTargetTextures folder. These will be used to texture the target plane in the Unity editor.
You should now see the following folder structure inside Unity:
Following is a summary of the folder contents:
Inspector view of the ARCamera
Inspector view of the ImageTarget
Note: When you added the Image Target object to your scene, a gray plane object appeared. This object is a placeholder for actual Image Targets. In the inspector view of the Image Target there is a pop-up list called Image Target. From this list, you can choose any Image Target that has been defined in one of the StreamingAssets/QCAR datasets so that the Image Target object in your scene adopts the size and shape of the Image Target it represents. The object is also textured with the same image from which the Image Target was created.
Now you can bind 3D content to your Image Target.
The Default Trackable Event Handler (DefaultTrackableEventHandler) is a script component of the Image Target that causes the cube you just drew to appear or disappear automatically – an automatic reaction to the appearance of the target in the video.
You can override this default behavior; for example, you could instead play a fade-out animation, show an info screen, or play a sound, as in the sketch below. For a more detailed description of the ITrackableEventHandler interface, please see 'Responding to Tracking Events' in the Special Options section.
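As an illustration, here is a minimal sketch of a custom handler, modeled on the DefaultTrackableEventHandler script from the samples. The class name and the foundSound field are placeholders for this example, and the exact interface signature may differ between versions of the extension:

    using UnityEngine;

    // Hypothetical replacement for DefaultTrackableEventHandler: attach it to the
    // Image Target instead of the default script.
    public class SoundTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
    {
        // Assigned in the Inspector; placeholder for whatever reaction you want.
        public AudioSource foundSound;

        private TrackableBehaviour mTrackableBehaviour;

        void Start()
        {
            // Register this handler with the TrackableBehaviour on the same GameObject.
            mTrackableBehaviour = GetComponent<TrackableBehaviour>();
            if (mTrackableBehaviour != null)
                mTrackableBehaviour.RegisterTrackableEventHandler(this);
        }

        // Called by the extension whenever the tracking state of the target changes.
        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            if (newStatus == TrackableBehaviour.Status.DETECTED ||
                newStatus == TrackableBehaviour.Status.TRACKED)
            {
                // The target appeared in the video: play a sound instead of simply
                // enabling the child renderers as the default handler does.
                if (foundSound != null)
                    foundSound.Play();
            }
            else
            {
                // The target was lost: start a fade-out animation, show an info
                // screen, or perform any other custom reaction here.
            }
        }
    }

The default handler simply enables and disables the renderers of the target's children; a custom handler like this one can react to the same events in any way you choose.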
The Vuforia SDK has the ability to use multiple active Device Databases simultaneously. To demonstrate this capability, you can borrow the StonesAndChips and Tarmac Device Databases from the ImageTargets sample and configure both to load and activate in the ARCamera’s Inspector panel. This allows you to use targets from both Device Databases at the same time in your Unity scene.
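If you prefer to do this from script rather than through the Inspector, the following sketch follows the extension's runtime data set loading pattern. The class and method names (ImageTracker, DataSet, ActivateDataSet) are taken from the Vuforia Unity scripting API of this era and may differ in other versions; the data set name is just one of the sample databases used as an example:

    using UnityEngine;

    // Sketch: activate an additional Device Database from script instead of ticking
    // the "Load Data Set" / "Activate" boxes in the ARCamera Inspector.
    // Assumes Vuforia has already initialized (in the samples this kind of loading
    // is handled by the DataSetLoadBehaviour attached to the ARCamera).
    public class LoadExtraDataSet : MonoBehaviour
    {
        // Name of a data set under StreamingAssets/QCAR, e.g. the sample "Tarmac" database.
        public string dataSetName = "Tarmac";

        void Start()
        {
            ImageTracker imageTracker = TrackerManager.Instance.GetTracker<ImageTracker>();

            if (imageTracker != null && DataSet.Exists(dataSetName))
            {
                DataSet dataSet = imageTracker.CreateDataSet();
                if (dataSet.Load(dataSetName))
                {
                    // Several data sets can be loaded and active at the same time.
                    imageTracker.ActivateDataSet(dataSet);
                }
            }
        }
    }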
The next step is to deploy your application to a supported device.
Unity provides a number of settings when building for Android devices – select from the menu (File > Build Settings… > Player Settings…) to see the current settings. Also, choose your platform now – Android or iOS.
You can now build the application. Attach your Android device and then click Build And Run to initialize the deployment process.
Unity provides a number of settings when building for iOS devices (File > Build Settings > Platform > iOS icon).
When building and running apps for iOS, Unity generates an Xcode project, then launches Xcode and loads that project. The Vuforia AR Extension includes a PostProcessBuildPlayer script that performs the task of integrating the Vuforia library into the generated Xcode project. This runs automatically when you select Build from within Unity. Be aware that if you manually change the generated Xcode project, you may need to update the PostProcessBuildPlayer script to avoid overwriting your changes.
The generated Xcode project includes a file called AppController.mm. This file contains Unity-provided options that let you tailor the performance of the app for your own purposes. The PostProcessBuildPlayer script sets THREAD_BASED_LOOP as the default because it gives the best visible performance with the samples provided alongside the Vuforia AR Extension. Consider changing these options to whatever gives the best performance for your own application.
Created AR scene
You should have a printout of the appropriate Image Target in front of you. If you are working with a target from one of the sample apps, the PDFs are located at Editor/QCAR/ForPrint/*.pdf. Otherwise, print out the image that you uploaded to the Target Manager and make sure that the aspect ratio doesn’t change. When you look at the target using the device camera, you should see your cube object bound to the target. Congratulations, you have successfully augmented reality!
The Vuforia Unity Extension supports the Play Mode feature, which provides AR application emulation through the Unity Pro Editor using a webcam. Configure this feature through the Web Cam Behaviour component of the ARCamera in the Inspector.
To use Play Mode for Vuforia in Unity Pro, simply select the attached, or built-in, webcam that you want to use from the Camera Device menu, and activate Play Mode using the Play button at the top of the Editor UI.
You can also use the standard Unity Play Mode with non-Pro Unity versions by setting 'Don't use for Play Mode' in the Web Cam Behaviour component.
To use standard Play Mode, adjust the transform of the ARCamera object to get your entire scene in view, and then run the application in the Unity editor. There is no live camera image or tracking in standard Play Mode; instead, all Targets are assumed to be visible. This allows you to test the non-AR components of your application, such as scripts and animations, without having to deploy to the device each time.