API Reference

SwiftyNsdk


Type Aliases

NetworkRequestId = ARDK_NetworkRequestId
-
NsdkHandle = ARDK_Handle
A type alias for the native NSDK handle used to interface with the underlying C API.
NsdkVpsAnchorId = String
A type alias for VPS anchor identifiers.
VPS anchors are identified by unique string identifiers that can be used to track and manage anchors throughout their lifecycle.

Associated Types

Configuration
The type of configuration used by this session.

Classes

AssetResult
Contains all the AssetInfo objects returned by a query to the Sites Manager service.
AwarenessImageResult
Image-based awareness result containing a raw image.
AwarenessResult
Base class for awareness results such as depth and segmentation.
Provides common properties like frame ID, timestamp, camera pose, and intrinsics.
BundlePlaybackDatasetLoader
A loader that retrieves playback dataset data from the app bundle.
This is the default implementation for loading datasets from Bundle.main. Frame images and depth data are loaded on demand when requested.
DataResourceOwner
-
DepthResult
Contains depth estimation results from the NSDK depth processing system.
## Overview
Depth results are generated by the depth processing system and include:
- Disparity maps for depth estimation. Unlike direct depth maps that provide distance values in meters, disparity maps contain pixel offset values that represent relative depth differences. These values must be converted to actual depth using camera intrinsic parameters and baseline information.
- Camera pose and orientation information
- Camera intrinsic parameters for coordinate transformations
- Error status and metadata
MeshData
Contains 3D mesh data for rendering and visualization.
MeshData provides access to 3D mesh geometry including vertices, indices, normals, and texture coordinates.
## Overview
MeshData includes:
- Vertices: 3D position data for mesh geometry
- Indices: The triangles that make up the mesh
- Normals: Surface normal vectors for the vertices, commonly used for lighting calculations (only available for live meshing)
- UVs: Texture coordinates for the vertices, used for mapping textures to the mesh (only available for mesh downloader)
## Example Usage
```swift
let (status, meshData) = meshDownloader.downloadMesh(meshId: meshId)
if status.isOk() {
    print("Mesh vertices: \(meshData.vertices.count)")
    print("Mesh triangles: \(meshData.indices.count / 3)")
    // Convert to SceneKit geometry for rendering
    if let geometry = meshData.toSCNGeometry() {
        let node = SCNNode(geometry: geometry)
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```
## Memory Management
Mesh data is backed by native memory that is automatically managed. The data remains valid as long as the MeshData instance exists.
final MeshDownloaderResults
Contains the downloaded mesh geometry data for a VPS location.
This object holds an array of mesh results, where each result includes mesh geometry, texture data (if requested), and the transform matrix that positions the mesh in world space.
final NsdkDepthSession
Depth feature session for NSDK. Provides control over depth sensing capabilities.
Upon starting the depth session, NSDK begins processing AR frames to generate depth data. The latest depth data can be retrieved using latestDepth(), and latestImageParams() provides the information needed to synchronize the depth image with the camera frame.
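A typical polling loop might look like the following sketch. It assumes the configure/start lifecycle from NsdkFeatureSession and the latestDepth()/latestImageParams() accessors named above; the alignDepth helper and exact tuple shapes are illustrative assumptions.

```swift
// Hypothetical sketch: poll the depth session once per rendered frame.
// `depthSession` is assumed to come from a factory on NsdkSession.
depthSession.start()

func onRenderFrame() {
    // latestDepth() is documented to return the most recent depth result.
    let (status, depthResult) = depthSession.latestDepth()
    guard status.isOk(), let depth = depthResult else { return }

    // latestImageParams() supplies the data needed to align the depth
    // image with the current camera frame (e.g. for display or reprojection).
    let (paramsStatus, params) = depthSession.latestImageParams()
    if paramsStatus.isOk(), let imageParams = params {
        alignDepth(depth, with: imageParams) // your own compositing code
    }
}
```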
final NsdkDeviceMappingSession
A session for creating VPS maps from AR data on the local device.
The device mapping feature provides capabilities for locally building persistent maps that can be used for Visual Positioning System (VPS) localization. These maps capture the visual features and spatial structure of an environment.
final NsdkMapStorage
A storage system for managing device-generated maps.
The map storage feature provides capabilities for capturing, storing, and managing map data from AR sessions. This data can be persisted and used for Visual Positioning System (VPS) localization and map updates.
final NsdkMeshDownloader
A session-scoped utility for downloading mesh geometry associated with VPS locations.
final NsdkMeshingSession
A session for real-time 3D mesh generation from AR camera frames.
The meshing feature provides capabilities for processing AR session data and generating a triangle mesh representation of the physical environment in real time. The mesh is divided into chunks that can be individually queried and updated as the environment is scanned.
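The chunked-update flow can be sketched as follows. It uses the updatedMeshInfos() accessor documented under MeshUpdateInfo; the isRemoved/chunkId properties and the scene-graph helpers are illustrative assumptions, not confirmed API.

```swift
// Hypothetical sketch of consuming live meshing updates each frame.
// updatedMeshInfos() is documented on NsdkMeshingSession; the property
// names on MeshUpdateInfo below are placeholders.
meshingSession.start()

func pollMeshUpdates() {
    for info in meshingSession.updatedMeshInfos() {
        if info.isRemoved {                    // assumed flag on MeshUpdateInfo
            removeChunkNode(id: info.chunkId)  // your scene-graph bookkeeping
        } else {
            refreshChunkNode(id: info.chunkId) // re-query geometry for this chunk
        }
    }
}
```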
final NsdkObjectDetectionSession
-
final NsdkRecordingExporter
A session for exporting scan recordings to various formats.
NsdkRecordingExporter provides capabilities for converting saved scan data into recorderV2 format for use in Unity Playback or for activating VPS.
final NsdkScanningSession
A session for 3D scanning and visualization functionality.
The scanning feature provides capabilities for capturing, processing, and exporting 3D scan data from AR sessions. Scans of a location can be processed by the Visual Positioning System's (VPS's) cloud services to enable VPS localization.
final NsdkSemanticsSession
A session for semantic segmentation and environmental understanding.
NsdkSemanticsSession provides capabilities for understanding the semantic structure of the environment by classifying pixels into different object categories. This enables applications to make intelligent decisions based on environmental context.
## Overview
Semantics features include:
- Real-time semantic segmentation of camera images
- Multiple semantic categories (sky, ground, buildings, people, etc.)
- Confidence maps for semantic classifications
- Packed channel data for efficient processing
- Suppression masks for filtering unwanted areas
## Usage Pattern
```swift
// Create and configure semantics session
let semanticsSession = nsdkSession.createSemanticsSession()
let config = Configuration()
semanticsSession.configure(with: config)
semanticsSession.start()

// Get available semantic channels
let (error, channelNames) = semanticsSession.getChannelNames()
if error == .none, let names = channelNames {
    print("Available channels: \(names)")
}

// Get semantic confidence for a specific channel
let (status, result) = semanticsSession.getLatestConfidence(channelIndex: 0)
if status.isOk(), let semanticResult = result {
    // Process semantic data
    processSemanticData(semanticResult)
}
```
final NsdkSession
The main entry point for the NSDK (Native SDK) framework.
NsdkSession provides the core functionality for AR applications, managing the lifecycle of NSDK features and serving as a factory for specialized sessions like VPS, WPS, scanning, and mapping. This class handles frame data processing, configuration management, and resource cleanup.
## Overview
Use NsdkSession to:
- Initialize the NSDK with your API key and configuration
- Send camera frame data for processing
- Create specialized feature sessions (VPS, WPS, Scanning, Mapping)
- Query required input data formats
- Manage the lifecycle of NSDK resources
## Example Usage
```swift
// Initialize with API key
let session = NsdkSession(apiKey: "your-api-key")

// Create a VPS session for localization
let vpsSession = session.createVpsSession()

// Send frame data during AR session
let status = session.sendFrame(frameData)
```
final NsdkSitesSession
-
@MainActor NsdkView
-
final NsdkVps2Session
-
final NsdkVpsCoverageSession
A session object for querying Visual Positioning System (VPS) coverage data and related resources.
final NsdkVpsSession
A session for Visual Positioning System (VPS) functionality.
NsdkVpsSession provides capabilities for precise localization using visual features. VPS can determine device position and orientation relative to a pre-mapped environment, enabling persistent AR experiences that maintain accuracy across sessions.
## Example Usage
```swift
// 1. Create the VPS session from your NSDK session
let vpsSession = nsdkSession.createVpsSession()

// 2. Configure the session (must be done before starting)
let config = NsdkVpsSession.Configuration(
    continuousLocalizationEnabled: true,
    temporalFusionEnabled: true
)
try vpsSession.configure(with: config)

// 3. Start the session
vpsSession.start()

// 4. Track an anchor using a payload (from Geospatial Browser or previously created)
let anchorId = try vpsSession.trackAnchor(payload: anchorPayload)

// 5. Poll for anchor updates regularly (e.g., using a Timer)
// Check feature status to ensure VPS is working correctly
let featureStatus = vpsSession.featureStatus()
if !featureStatus.isOk() {
    print("VPS Feature Status Error: \(featureStatus)")
}

// Get the latest anchor update
if let update = vpsSession.anchorUpdate(anchorId: anchorId) {
    // 6. Check tracking state before placing content
    switch update.trackingState {
    case .notTracked:
        print("Anchor not tracked - waiting for localization")
    case .limited:
        print("Limited tracking - pose may be unreliable")
    case .tracked:
        // Anchor is fully tracked - safe to place content
        if let transform = update.anchorToLocalTransform {
            // Update your AR content with the anchor's transform
            anchorEntity.transform = Transform(matrix: transform)
        }
    }
}

// Optional: Create new anchors at specific poses (requires successful localization)
let newAnchorId = try vpsSession.createAnchor(at: cameraPose)

// Optional: Get payload for created anchor (only available when tracked)
if let payloadState = vpsSession.anchorPayload(anchorId: newAnchorId) {
    switch payloadState {
    case .inProgress:
        print("Payload not yet available")
    case .success(let payload):
        print("Anchor payload: \(payload)")
        // Store or share this payload for future sessions
    }
}

// Optional: Get GPS coordinates from VPS localization
// (requires gpsCorrectionForContinuousLocalization enabled in config)
let result = vpsSession.devicePoseAsGeolocation(pose: cameraPose)
switch result {
case .success(let geolocation):
    print("Lat: \(geolocation.latitude), Lon: \(geolocation.longitude)")
case .failure(let error):
    print("Geolocation error: \(error)")
}

// Stop the session when done
vpsSession.stop()
```
final NsdkWpsSession
A session for World Positioning System (WPS) functionality.
WPS provides the 3D position and orientation of the device in geographic coordinates as an alternative to using device GPS and compass heading data. WPS offers greater accuracy and frame-to-frame stability than standard GPS positioning, making it better suited to AR applications. As the user moves around, WPS maintains the device's position, so it can be used continuously over long periods of time and long distances. WPS works in any location where the phone has a GPS signal, though accuracy varies with GPS quality.
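A minimal usage sketch, assuming the NsdkFeatureSession start lifecycle, the latestLocation() accessor shown in the WpsLocation entry, and a hypothetical createWpsSession() factory on NsdkSession:

```swift
// Hypothetical sketch: start WPS and poll for the latest geographic fix.
let wpsSession = nsdkSession.createWpsSession() // assumed factory method
wpsSession.start()

func pollWps() {
    // latestLocation() returns a status plus a WpsLocation value.
    let (status, location) = wpsSession.latestLocation()
    guard status.isOk() else { return }
    print("WPS fix: \(location.referenceLatitudeDegrees), \(location.referenceLongitudeDegrees)")
}
```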
final ObjectDetectionClassNamesBuffer
A read-only container for the list of class names supported by the object detection model.
final ObjectDetectionImageParams
A read-only container for the image parameters of an object detection frame.
final ObjectDetectionMetadata
A read-only container for the metadata of an object detection frame.
final ObjectDetectionResult
A read-only container for the results of a single, successful object detection frame.
OrganizationResult
Contains all the OrganizationInfo objects returned by a query to the Sites Manager service.
@MainActor PlaybackBackgroundRenderer
-
PlaybackDataset
A dataset loaded from a capture JSON file containing frame metadata.
This class uses on-demand loading for frame images and depth data. Only the currently requested frame is loaded into memory, reducing memory pressure for large datasets.
PlaybackDatasetLoader
Base class for loading playback dataset data from various sources.
This class provides a base implementation that must be subclassed. Subclasses must override loadCaptureJSON(), loadImage(imageName:), loadDepthData(depthFileName:), and loadDepthConfidence(confidenceFileName:) to provide concrete implementations.
The loader uses on-demand loading - only the capture JSON is loaded upfront, and frame images/depth data are loaded when requested by PlaybackDataset.
- Note: This class acts as an abstract base class. Do not instantiate it directly.
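Since the four override points are named above, a file-system-backed loader might look like this sketch. The directory layout, the "capture.json" filename, and the base-class initializer are assumptions; check PlaybackDatasetConstants for the real file names.

```swift
import Foundation
import CoreGraphics
import ImageIO

// Hypothetical loader that reads a recorded dataset from an arbitrary
// directory instead of the app bundle, overriding the four methods the
// base class documents as required.
final class DirectoryPlaybackDatasetLoader: PlaybackDatasetLoader {
    let root: URL
    init(root: URL) {
        self.root = root
        super.init() // assumes a parameterless designated initializer
    }

    override func loadCaptureJSON() -> Data? {
        try? Data(contentsOf: root.appendingPathComponent("capture.json"))
    }
    override func loadImage(imageName: String) -> CGImage? {
        let url = root.appendingPathComponent(imageName)
        guard let src = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
        return CGImageSourceCreateImageAtIndex(src, 0, nil)
    }
    override func loadDepthData(depthFileName: String) -> Data? {
        try? Data(contentsOf: root.appendingPathComponent(depthFileName))
    }
    override func loadDepthConfidence(confidenceFileName: String) -> Data? {
        try? Data(contentsOf: root.appendingPathComponent(confidenceFileName))
    }
}
```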
PlaybackSession
A session that plays back frame metadata and images from a loaded dataset.
Playback runs on a background queue and continuously loops through frames at the framerate specified in the dataset. The session notifies its delegate of each frame update.
@MainActor PlaybackView
-
RaycastBuffer
A read-only container for the raycast buffer information generated during scanning.
SemanticsResult
Contains semantic segmentation results from the NSDK semantics processing system.
SemanticsResult provides semantic understanding of the environment by classifying pixels in camera images into different object categories (e.g., sky, ground, buildings, people, vehicles). This enables applications to understand the scene structure and make intelligent decisions based on environmental context.
## Overview
Semantic segmentation results include:
- Confidence maps: Per-pixel confidence scores for semantic classifications
- Packed channels: Multiple semantic categories encoded in a single image
- Suppression masks: Masks indicating areas to be ignored or suppressed
- Metadata: Frame information, timestamps, and error status
## Example Usage
```swift
// Get confidence for a specific semantic channel
let (status, confidenceResult) = semanticsSession.getLatestConfidence(channelIndex: 0)
if status.isOk(), let result = confidenceResult {
    print("Confidence image size: \(result.image?.width ?? 0) x \(result.image?.height ?? 0)")
    print("Frame ID: \(result.frameId)")
    print("Timestamp: \(result.timestampMs)")

    // Process confidence data for semantic understanding
    processSemanticConfidence(result)
}

// Get packed semantic channels
let (packedStatus, packedResult) = semanticsSession.getLatestPackedChannels()
if packedStatus.isOk(), let result = packedResult {
    // Process packed semantic data
    processPackedSemantics(result)
}
```
SiteResult
Contains all the SiteInfo objects returned by a query to the Sites Manager service.
SitesResult
Base class for results returned by queries to the Sites Manager service.
UserResult
Contains the user information returned by a query to the Sites Manager service.
VoxelBuffer
A read-only container for the voxel buffer information generated during scanning.

Protocols

NsdkFeatureSession
A protocol that defines the common lifecycle and configuration interface for NSDK feature sessions.
NsdkLogCallback : AnyObject
Protocol for receiving log messages from NSDK.
Implement this protocol to receive NSDK log messages in your application. The callback will be invoked on background threads, so ensure your implementation is thread-safe.
## Example Usage
```swift
class MyLogCallback: NsdkLogCallback {
    func onLog(level: NsdkLogLevel, message: String, fileName: String?, fileLine: Int, funcName: String?) {
        let levelStr = level.description
        let location = fileName.map { "\($0):\(fileLine)" } ?? ""
        let funcInfo = funcName.map { " \($0)" } ?? ""
        print("[NSDK-\(levelStr)] \(location)\(funcInfo): \(message)")
    }
}

let callback = MyLogCallback()
let session = NsdkSession(apiKey: "your-key", logCallback: callback)
```
NsdkSessionDelegate : ARSessionDelegate, PlaybackSessionDelegate
-
PlaybackDatasetSource
Protocol for loading playback dataset data from various sources.
This protocol abstracts data retrieval, allowing for different implementations such as bundle loading, file system loading, remote loading, or mock data for testing. Implementations are used for on-demand frame loading - the loader is passed to PlaybackDataset, which calls these methods when frames are requested.
@MainActor PlaybackRenderer
-
PlaybackSessionDelegate : Sendable
Delegate protocol for receiving frame updates during playback.
ResourceOwner : AnyObject
-
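A conforming delegate might look like the following sketch. The callback described in the Methods section is called on the playback queue and should hop to the main queue for UI work; the parameter labels here are assumptions, so match the real protocol requirement in your build.

```swift
import Foundation

// Hypothetical delegate that forwards playback frames to the main queue.
final class FrameLogger: PlaybackSessionDelegate {
    func playbackSession(_ session: PlaybackSession, didUpdate frame: NsdkPlaybackFrame) {
        // Delegate callbacks arrive on the playback queue; hop to the
        // main queue before touching UI state.
        DispatchQueue.main.async {
            updateUI(with: frame) // your own rendering/UI code
        }
    }
}
```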

Structs

AreaTarget
Contains a CoverageArea and its associated LocalizationTarget.
AreaTargetResult
-
ARUtils
Utility functions for AR and device capability detection.
ARUtils provides helper methods for detecting device capabilities and AR features that are relevant to NSDK functionality.
AssetInfo
Represents asset information from the Sites Manager service.
Maps to proto messages AssetRecord, AssetData, and AssetComputedValues.
AssetMeshData
Mesh-specific asset data.
Maps to proto message AssetMeshData.
AssetSplatData
Splat-specific asset data.
Maps to proto message AssetSplatData.
AssetVpsData
VPS-specific asset data.
Maps to proto message AssetVpsData.
AuthInfo
Authentication information containing token claims.
Contains parsed JWT claims including the token string, expiration, user information, and other standard JWT fields.
AwarenessImageParams
-
CoverageArea
Represents a geographic area where VPS localization is possible.
CoverageAreaResult
Contains all the CoverageArea objects returned by a query to the VPS Coverage service.
GeolocationData
Struct representing geolocation data including latitude, longitude, altitude, heading, and orientation.
HintImageResult
Image data returned by a query to a VPS hint image URL.
ImageMath
Provides affine transformation utilities for image processing.
All affine matrices returned by this class operate in normalized coordinates, where image space is mapped to the [0, 1] range in both axes with the origin at the top left.
LatLng
Struct representing a geographical coordinate with latitude and longitude in degrees.
LocalizationTarget
Represents a real-world point of interest that is a VPS localization target.
VPS localization is more likely to succeed when a localization target is in camera view.
LocalizationTargetResult
Contains all the LocalizationTarget objects returned by a query to the VPS Coverage service.
MapMetadata
Structure representing the metadata of a device map for visualization and processing.
MeshDownloaderResult
Represents a single mesh result with geometry, texture, and transform data.
MeshUpdateInfo
Information about mesh chunk updates from live meshing operations.
Returned by NsdkMeshingSession.updatedMeshInfos() to indicate which mesh chunks have been modified or removed since the last update.
NsdkBuffer
A buffer containing binary data for NSDK operations.
NsdkBuffer provides a safe wrapper around binary data buffers used by various NSDK features. It handles memory management and provides convenient access to buffer data.
## Overview
NSDK buffers are used for:
- Image data transfer
- Mesh data storage
- Configuration data
- Any binary data that needs to be passed between Swift and the native NSDK layer
## Example Usage
```swift
// Create a buffer from Swift Data
if let imageData = UIImage(named: "texture")?.pngData() {
    let buffer = NsdkBuffer(data: imageData)

    // Access buffer data
    print("Buffer size: \(buffer.dataSize) bytes")

    // Use the buffer with NSDK APIs
    let status = someArdkFunction(buffer: buffer)
}
```
## Memory Management
NsdkBuffer automatically manages memory allocation and deallocation. When created from Swift Data, it maintains a reference to prevent premature deallocation.
NsdkFeatureStatus
Status flags for NSDK features indicating their current operational state.
NsdkFeatureStatus is an option set that represents various status conditions for NSDK features like VPS, WPS, scanning, and mapping. Multiple status flags can be active simultaneously to provide detailed status information.
## Overview
Use this to monitor the health and state of NSDK features:
- Check for errors that need attention
- Monitor initialization progress
- Verify configuration and API key validity
- Ensure features are ready for operation
## Example Usage
```swift
let status = vpsSession.getFeatureStatus()
if status.contains(.badApiKey) {
    print("Invalid API key - check your credentials")
} else if status.contains(.configurationFailed) {
    print("Feature configuration failed")
} else if status.contains(.initializing) {
    print("Feature is still initializing...")
} else if status == .ok {
    print("Feature is ready and operational")
}
```
NsdkFrameData
A complete frame of data captured from an AR session.
NsdkFrameData encapsulates all the sensor data, images, and tracking information from a single AR frame. This includes camera images, depth data, device pose, GPS location, compass heading, and camera intrinsics.
## Overview
Frame data is the primary input to NSDK for all AR processing tasks, including:
- Visual positioning and localization
- 3D scanning and reconstruction
- Map building and tracking
- AR location positioning
## Example Usage
```swift
// In your ARSession delegate
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let frameData = NsdkFrameData(
        timestampMs: UInt64(frame.timestamp * 1000),
        cameraImage: RawImage(from: frame.capturedImage),
        depthImage: frame.sceneDepth.map { RawImage(from: $0.depthMap) },
        // ... other data
    )
    nsdkSession.sendFrame(frameData)
}
```
NsdkInputDataFlags
Flags indicating which types of input data are required by NSDK.
NsdkInputDataFlags is an option set that specifies which data types should be included in frames sent to NSDK. Use getRequestedDataInputs() to determine which data is currently needed, then include only the requested data types in your frame data for optimal performance.
## Overview
NSDK features dynamically request different types of input data based on:
- Which features are active (VPS, WPS, scanning, mapping)
- Current processing state and requirements
- Device capabilities and available sensors
## Example Usage
```swift
let requiredInputs = nsdkSession.getRequestedDataInputs()
var frameData = NsdkFrameData()
if requiredInputs.contains(.pose) {
    frameData.cameraTransform = currentPose
}
if requiredInputs.contains(.cameraImage) {
    frameData.cameraPlane0 = cameraPlane
}
if requiredInputs.contains(.platformDepth) {
    frameData.depthData = depthBuffer
}
nsdkSession.sendFrame(frameData)
```
NsdkPathConfig
-
NsdkPlaybackFrame
A frame of data from a playback session, bundling together frame metadata and associated buffers.
NsdkUtils
Utility functions for NSDK string management and memory handling.
NsdkUtils provides helper methods for safely managing C string conversions and memory allocation when working with the NSDK C API.
## Overview
The utilities in this struct help manage the complexity of converting between Swift strings and C strings while ensuring proper memory cleanup and avoiding memory leaks.
OrganizationInfo
Represents organization information from the Sites Manager service.
RawImage
A raw image containing pixel data for NSDK operations.
RawImage provides access to image data in various formats (RGB, grayscale, depth, etc.) used by NSDK features like depth processing, semantic segmentation, and image analysis.
## Overview
Raw images are used throughout NSDK for:
- Camera frame processing
- Depth map representation
- Semantic segmentation results
- Image format conversions
- Computer vision operations
## Example Usage
```swift
let (status, depthResult) = depthSession.getDepth()
if status.isOk() {
    let image = depthResult.image
    print("Image size: \(image.width) x \(image.height)")
    print("Image type: \(image.type)")

    // Access pixel data
    let pixelData = image.data
    // Process pixel data based on image type
}
```
## Memory Management
This object's pointers are only valid while the NSDK object that holds it is still in scope.
SemanticsChannels
A set of semantic channels represented as a bitmask.
This OptionSet allows for efficient bitwise operations on channel sets.
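The bitmask behavior can be illustrated with a stand-in OptionSet (illustrative only; the real SemanticsChannels type defines its own channels):

```swift
// Illustrative stand-in mirroring how a bitmask channel set behaves.
struct DemoChannels: OptionSet {
    let rawValue: UInt32
    static let sky      = DemoChannels(rawValue: 1 << 0)
    static let ground   = DemoChannels(rawValue: 1 << 1)
    static let building = DemoChannels(rawValue: 1 << 2)
}

let wanted: DemoChannels = [.sky, .ground]         // union via array literal
let detected: DemoChannels = [.ground, .building]
let overlap = wanted.intersection(detected)        // bitwise AND
assert(overlap == .ground)
assert(wanted.contains(.sky))
```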
SiteInfo
Represents site information from the Sites Manager service.
TextureUtils
-
TimeoutError
-
UserInfo
Represents user information from the Sites Manager service.
Vps2GeolocationData
GPS and heading data calculated by VPS2.
Vps2NetworkRequestRecord
-
Vps2Pose
Pose in AR coordinate space calculated by VPS2 from a geolocation.
Vps2Transformer
-
VpsAnchorUpdate
Contains the latest tracking information for a VPS anchor.
Anchor updates are retrieved via getAnchorUpdate(anchorId:) and provide the most current information about an anchor's position, orientation, and tracking status. This is a snapshot of the anchor, so the latest anchor update should be fetched every frame.
WpsLocation
Contains world positioning data from the WPS (World Positioning System).
WpsLocation provides global positioning information that combines GPS/GNSS data with visual positioning for enhanced accuracy and reliability.
## Overview
WPS location data includes:
- Reference GPS coordinates (latitude, longitude, altitude)
- Transformation matrix for coordinate conversions
- Status information about positioning quality
## Example Usage
```swift
let (status, location) = wpsSession.latestLocation()
if status.isOk() {
    print("GPS Coordinates: \(location.referenceLatitudeDegrees), \(location.referenceLongitudeDegrees)")
    print("Altitude: \(location.referenceAltitudeMetres) meters")
    print("Status: \(location.status)")

    // Use the transformation matrix for coordinate conversions
    let worldPosition = location.trackingToRelativeEdn
    placeARContent(at: worldPosition)
}
```

Enums

AgeLevel
Codes describing the age level of the user.
This enum represents the age classification for users of the NSDK.
AssetDeploymentType
Asset deployment type.
Maps to proto enum AssetDeploymentType.
AssetPipelineJobStatus
Asset pipeline job status.
Maps to proto enum AssetPipelineJobStatus.
AssetStatusType
Asset status.
Maps to proto enum AssetStatusType.
AssetType
Asset type - determines which typed asset data is present.
Maps to proto enum AssetType.
AwarenessError
-
ExportResolution
Resolution option for exported scan images.
When exporting a recording, this controls which image resolutions are included in the payload.
ImageType
Enum representing the various image types supported by NSDK.
NsdkAsyncState<Value, Error> where Error : Error
Reports the state of an asynchronous NSDK operation.
NsdkError
Errors thrown by the NSDK API.
NsdkError is a subset of the ARDK_Status codes returned by the C API, containing just those that can occur in the Swift environment.
NsdkLogLevel
Defines the available logging levels for NSDK.
NsdkLogLevel controls the verbosity of logging output from the NSDK system. Logging can be configured separately for stdout, files, and callback functions.
## Overview
Log levels follow a hierarchical structure where higher levels include all messages from lower levels. For example, setting the level to .warn will include warning, error, and fatal messages, but exclude debug and info messages.
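The threshold behavior can be pictured with a stand-in enum (illustrative only; the real NsdkLogLevel defines its own cases and ordering):

```swift
// Illustrative stand-in: higher raw values are more severe, and a
// configured threshold admits its own level and everything above it.
enum DemoLogLevel: Int, Comparable {
    case debug = 0, info, warn, error, fatal
    static func < (a: DemoLogLevel, b: DemoLogLevel) -> Bool { a.rawValue < b.rawValue }
}

let threshold: DemoLogLevel = .warn
let shouldEmit = { (level: DemoLogLevel) in level >= threshold }
assert(shouldEmit(.error))   // included: at or above .warn
assert(!shouldEmit(.info))   // excluded: below .warn
```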
NsdkNetworkError
Possible errors from network operations.
NsdkNetworkRequestStatus
Status of a network request.
PlaybackDatasetConstants
Constants for playback dataset file names and extensions.
SemanticsChannelName
-
TypedAssetData
Discriminated union for typed asset data.
One of mesh, splat, or vps will be set based on the asset type.
Vps2NetworkRequestType
-
Vps2TrackingState
-
VpsGraphOperationError
-
WpsError
Represents the current error status of the WPS (World Positioning System) feature.

Methods

configure -> Void
-
featureStatus -> NsdkFeatureStatus
Gets the current status of the feature.
This method reports any errors or warnings that have occurred within the feature system. Check this periodically to monitor the health of operations. Once an error is flagged, it will remain flagged until the problematic process runs again and completes successfully.
- Returns: Feature status flags indicating the current state and any issues
info -> String
Returns details about the loader configuration.
- Returns: A string describing the loader configuration
loadCaptureJSON -> Data?
Loads the capture JSON data from the data source.
- Returns: The JSON data as Data, or nil if loading fails
loadDepthConfidence -> Data?
Loads depth confidence data from the data source.
This method is called on-demand when confidence data is requested.
- Parameter confidenceFileName: The filename of the confidence data file (e.g., "confidence_00000000.bin")
- Returns: The confidence data as Data, or nil if loading fails or confidence data is not available
loadDepthData -> Data?
Loads depth data from the data source.
This method is called on-demand when depth data is requested.
- Parameter depthFileName: The filename of the depth data file (e.g., "depth_00000000.bin")
- Returns: The depth data as Data, or nil if loading fails or depth data is not available
loadImage -> CGImage?
Loads an image from the data source.
This method is called on-demand when a frame image is requested.
- Parameter imageName: The filename of the image (e.g., "frame_00000000.jpg")
- Returns: The image as a CGImage, or nil if loading fails
nsdkSession -> Void
Called when a new frame has been sent to the underlying NSDK.
- Parameters:
- session: The NSDK session being run
- frame: The NSDK frame that was just sent to the underlying native NSDK subsystems
onLog -> Void
Called when NSDK generates a log message.
- Parameters:
- level: The severity level of the log message
- message: The log message content
- fileName: Optional source file name where the log was generated
- fileLine: Line number in the source file
- funcName: Optional function name where the log was generated
playbackSession -> Void
Called each time a new frame is ready during playback.
This method is called on the playback queue, so any UI updates should be dispatched to the main queue.
- Parameters:
- session: The playback session that generated the update
- frame: The frame data including metadata, image, and depth buffers
@MainActor renderFrame -> Void
-
start -> Void
Starts the feature session.
After starting, the session will begin processing incoming frame data according to its configured behavior. The session must be configured before starting.
stop -> Void
Stops the feature session.
This halts all processing. The session can be reconfigured and restarted after stopping.