feat: wifi survey support #1029

Open
opened 2026-03-28 04:30:58 +00:00 by mfreeman451 · 9 comments
Owner

Imported from GitHub.

Original GitHub issue: #2835
Original author: @mfreeman451
Original URL: https://github.com/carverauto/serviceradar/issues/2835
Original created: 2026-02-14T05:38:03Z


Is your feature request related to a problem?

Got an idea from @marvin-hansen to create a wifi survey app:

https://www.youtube.com/watch?v=8kxkFlnhYBs
https://youtu.be/8kxkFlnhYBs?t=633

Needs:

- [x] gemini ultra sub
- [x] #2878
- [ ] serviceradar (SR) API integration
- [x] #2881
- [x] #2885

The app will be used to conduct wifi surveys in your house. Not sure how all of the mapping would work (lidar on newer iPhones?). After the user is done taking the survey, the data would be uploaded to SR for processing.

This is kind of a huge undertaking.

Prompt used in the video:

Write a single-file Swift (SwiftUI) iPhone app that maps my Wi-Fi environment in real time.

The app should repeatedly scan all visible Wi-Fi access points, use their RSSI signal strengths to estimate relative positions, and render a live, visually compelling 3D map of the “Wi-Fi space” around me.

How it should work

Scan visible APs repeatedly (every 1–2 seconds) from my iPhone’s current position.

Capture at minimum:

SSID (do not obfuscate; show real SSIDs)

BSSID (if available)

RSSI

Frequency band (2.4 GHz vs 5 GHz, if available)

Convert RSSI to estimated relative distance using the best available signal-to-distance methodology.
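The standard choice for that step is the log-distance path loss model. A minimal sketch (Python for illustration, not the requested Swift; the 1 m reference power and path-loss exponent are assumed values that would need per-environment calibration):

```python
def rssi_to_distance(rssi_dbm, ref_power_dbm=-40.0, path_loss_exponent=2.5):
    """Estimate distance (metres) from RSSI via the log-distance path loss model.

    ref_power_dbm: assumed RSSI at the 1 m reference distance (calibration guess).
    path_loss_exponent: ~2.0 in free space, roughly 2.5-4.0 indoors with walls.
    """
    return 10 ** ((ref_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# A reading equal to the reference power maps to the 1 m reference distance.
print(rssi_to_distance(-40.0))  # 1.0
print(rssi_to_distance(-65.0))  # 10.0 (with exponent 2.5)
```

Because the exponent varies wildly indoors, these distances are best treated as relative, which is all the embedding step below needs.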

Because a single receiver position only yields star-topology distances (me → each AP), use an approach to infer which APs are likely near each other, and embed the entire set into 3D.
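One candidate for that embedding step is classical multidimensional scaling: if an AP-to-AP distance matrix can be estimated (e.g. symmetrized from co-observation statistics over many scans, which is an assumption here), double-centering plus an eigendecomposition yields 3D coordinates. A sketch:

```python
import numpy as np

def classical_mds(dist, dim=3):
    """Embed a symmetric distance matrix into `dim`-D via classical MDS."""
    d2 = np.asarray(dist, float) ** 2
    n = d2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ d2 @ J                 # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]    # largest eigenpairs carry the geometry
    scale = np.sqrt(np.clip(vals[idx], 0, None))
    return vecs[:, idx] * scale           # (n, dim) coordinates, up to rotation
```

For genuinely Euclidean input this recovers all pairwise distances exactly (up to rotation/reflection); noisy RSSI-derived distances would only be approximated, which is where the "uncertainty" rendering comes in.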

Visualization requirements (go all out)

Use a real 3D, interactive visualization inside the app (SceneKit / RealityKit preferred), not a 2D plot.

Each AP is a 3D node:

Sized by signal strength

Colored by band (2.4 GHz vs 5 GHz)

Labeled with its real SSID

Show estimated coverage as translucent spheres around each AP, scaled by estimated range/strength.

Show uncertainty via:

Sphere opacity and blur/fuzziness

Well-localized APs look sharp/solid; poorly constrained ones look diffuse/transparent

Animate convergence:

As more samples arrive, show positions refining and uncertainty shrinking in real time

Show my position at the origin as a distinct marker (and optionally a subtle reference grid/axes).

Aesthetic direction

Dark, high-contrast “peering into invisible RF space” look

Glowing nodes, soft bloom-like feel, subtle motion

Optional ASCII/Matrix-inspired vibe without sacrificing readability

Constraints / goal

The app should be self-contained: no trained models, no external data sources—just math, signal processing, and local rendering.

Target: run it, stand/sit in place for ~30 seconds, and watch a beautiful 3D map materialize that reveals the hidden geometry of my surrounding Wi-Fi environment.

Describe the solution you'd like

A clear and concise description of what you want to happen.

Describe alternatives you've considered

A clear and concise description of any alternative solutions or features you've considered.

Additional context

Add any other context or screenshots about the feature request here.

Author
Owner

Imported GitHub comment.

Original author: @mfreeman451
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3901216584
Original created: 2026-02-14T06:17:11Z


I'm guessing there might be some open building/layout format where you lay out your building with some drawing tool, and then import it into your survey app?

Author
Owner

Imported GitHub comment.

Original author: @marvin-hansen
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3901224113
Original created: 2026-02-14T06:21:52Z


I would go the other way around and try to collect wifi networks from existing access points, feed them into SR, combine the findings with all the existing data in SR and render it. This is super cool because then you could also visualize optionally all devices connected via each WiFi network.

For your enterprise adopters, you can add:

  • a whitelist of known and allowed networks and APs. That way you can instantly discover unauthorized access points, which are a massive security risk.

  • Combined LAN & Wi-Fi NetFlow, e.g. literally from the iPhone to the outbound router. That way you can certify that network policies are effective. Furthermore, a real-time 3D render of the data flow gives a stunning UI for your bored admin.

  • An optional LiDAR scan linked to the coordinates of the network gear paves the way for a digital twin of the actual network topology, and therefore the opportunity to analyze it for gaps. A big win for large and complex networks. I think that would go above and beyond even paid networking solutions.

Author
Owner

Imported GitHub comment.

Original author: @marvin-hansen
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3901233437
Original created: 2026-02-14T06:24:48Z


To clarify the premise: all access points also listen for other networks, and I think it should be possible to retrieve that list remotely, provided you have secure access to the AP. Instead of walking around all company facilities to scan for Wi-Fi, you query all existing APs for a list and then feed this into SR.

Author
Owner

Imported GitHub comment.

Original author: @mfreeman451
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3901234471
Original created: 2026-02-14T06:25:12Z


This document outlines the requirements for the ServiceRadar: FieldSurvey (iOS) companion app and its deep integration into the ServiceRadar "God-View" Visualization Engine.


PRD: ServiceRadar FieldSurvey & Cyber-Physical Integration

1. Vision & Purpose

To provide the "Eyes on the Ground" for the ServiceRadar ecosystem. FieldSurvey transforms an iPhone/iPad into a high-fidelity cyber-physical scanner, mapping the invisible RF environment (Wi-Fi/BLE) directly onto a LiDAR-generated 3D floorplan. This data is "fused" with the ServiceRadar backbone to move the platform from abstract network mapping to a true Digital Twin.


2. The Integrated Architecture

2.1 The "Join" Engine (Logical to Physical)

The core value proposition is the automated matching of signals detected in the air to assets detected on the wire:

  • Logical Key: MAC Addresses/BSSIDs discovered by the ServiceRadar Backend (via WLC/SNMP).
  • Physical Key: BSSIDs and Management Radios (BLE) detected by the iOS App.
  • The Fusion: When a survey is uploaded, the backend performs a Spatial Join, pinning the logical node to the exact X/Y/Z coordinate in the USDZ room model.

2.2 The Data Vehicle (Mobile Arrow IPC)

Consistent with the ServiceRadar high-performance standard, the app does not send JSON.

  • Implementation: Survey samples (Timestamp, BSSID, RSSI, Frequency, Coordinates) are serialized into Apache Arrow RecordBatches on the device.
  • Result: Minimal battery drain and sub-millisecond ingestion into the Rust-based causality engine.

3. iOS App Requirements (FieldSurvey)

3.1 Physical Mapping (The Foundation)

  • RoomPlan Integration: Utilize the iOS RoomPlan API to generate a 3D wireframe of the environment (walls, doors, windows, furniture) in real-time.
  • LiDAR Localization: Use the LiDAR scanner to maintain absolute position within the room, ensuring RF samples are mapped to coordinates with <10cm variance.

3.2 RF Sensing (The Atmosphere)

  • Wi-Fi Scanning: Continuous polling of SSID, BSSID, RSSI, and Channel metadata.
  • Management Radio Detection (BLE): Scanning for OOB (Out-of-Band) management radios and asset beacons on enterprise switches.
  • Signal-to-Distance Modeling: Implementation of the Log-Distance Path Loss model to estimate the distance to APs behind physical obstructions.

3.3 Real-Time AR Visualization

  • RF Clouds: RealityKit-based translucent spheres representing Wi-Fi coverage.
  • Identity Spheres: AR overlays over non-RF equipment (Switches/Routers) driven by BLE discovery or manual "Pinning."
  • Crystallization UI: Nodes with low data confidence appear as "Foggy/Diffuse." As the user walks around the device to gather more angles, the node "Crystallizes" into a solid, sharp object.

4. Backend & God-View Integration

4.1 Rustler Spatial Processing

  • Coordinate Normalization: A Rustler NIF translates the iOS local coordinate system into the global Site Topology.
  • Multilateration Engine: The backend processes high-velocity RSSI samples from the app to triangulate the exact physical location of APs that are hidden in ceilings or walls.
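For reference, the core multilateration step can be linearized into a least-squares solve by subtracting the first sphere equation from the rest. A sketch (the real engine would presumably also weight samples by RSSI confidence):

```python
import numpy as np

def multilaterate(positions, dists):
    """Estimate an AP position from >=4 sample points and distance estimates.

    positions: (n, 3) surveyor coordinates; dists: (n,) estimated distances.
    Subtracting equation 0 from equations 1..n-1 yields a linear system in x:
        2 (p_i - p_0) . x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2
    """
    p = np.asarray(positions, float)
    d = np.asarray(dists, float)
    A = 2 * (p[1:] - p[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(p[1:] ** 2 - p[0] ** 2, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With exact distances this recovers the position exactly; with noisy RSSI-derived distances, the least-squares residual doubles as a per-AP uncertainty estimate for the "crystallization" UI.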

4.2 God-View "Physical Layer" Enhancement

  • USDZ Mantle: The "Mantle" layer of the web visualization is no longer abstract. For surveyed sites, the charcoal grid is replaced by the actual 3D USDZ Floorplan captured by the app.
  • Visual Overlay: Users can toggle "Physical View" to see logical traffic particles flowing through actual corridors and rooms.

5. Critical Causal Use Cases

5.1 Case: Physical Obstruction Analysis

  • Signal: ServiceRadar Discovery shows an AP with high packet retransmission.
  • Survey Data: The App's LiDAR detects a new metal partition or concrete pillar installed near the AP.
  • Result: The "Deep Causality" engine flags the Physical Obstruction as the root cause, rather than a switch configuration error.

5.2 Case: Rogue AP Localization

  • Signal: The App detects a BSSID not present in the ServiceRadar logical database.
  • Action: The God-View initiates a Red Radar Pulse in the exact 3D coordinate where the app detected the strongest signal.
  • Outcome: Security teams can walk directly to the physical desk/outlet where the rogue device is plugged in.
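The detection half of this flow reduces to a set difference between the air-side and wire-side inventories; a minimal sketch with hypothetical names:

```python
def find_rogue_bssids(observed, authorized):
    """Return BSSIDs seen over the air but absent from the logical inventory.

    Case-normalizes MAC-style strings so 'AA:BB:...' matches 'aa:bb:...'.
    Both inputs are iterables of BSSID strings; names are illustrative.
    """
    authorized_set = {b.lower() for b in authorized}
    return sorted({b.lower() for b in observed} - authorized_set)

rogues = find_rogue_bssids(
    observed=["AA:11:22:33:44:55", "bb:66:77:88:99:00"],
    authorized=["aa:11:22:33:44:55"],
)
print(rogues)  # ['bb:66:77:88:99:00']
```

The localization half (where the Red Radar Pulse goes) then comes from the multilateration engine above, using the survey sample with the strongest RSSI for that BSSID.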

6. Technical Stack

  • Frontend (iOS): Swift, SwiftUI, ARKit, RealityKit, RoomPlan.
  • Data Serialization: Apache Arrow (Swift/Rust).
  • Backend: Elixir (Phoenix Channels), Rust (Rustler NIFs).
  • Storage: Apache AGE (Graph) + PostgreSQL (Spatial metadata).

7. Success Metrics

  1. Frictionless Setup: Mapping a 1,000 sq ft server room takes < 60 seconds.
  2. Asset Identification: 100% accuracy in matching BLE Management Radios to logical switch IDs.
  3. Data Efficiency: A complete 3D survey of a site occupies < 10MB via Arrow compression.
  4. Visual Parity: The 3D floorplan in the web-based God-View is a 1:1 replica of the iOS scan.

8. Aesthetic Specification ("Surveyor Nocturne")

  • Scanner View: Wireframe "Green Matrix" LiDAR mesh over real-world video.
  • Signal Spheres: High-intensity pulsing electric blue (5GHz) and electric orange (2.4GHz).
  • HUD: ASCII-style telemetry readouts in the corners of the AR view to maintain the "ServiceRadar" cyberpunk feel.
Author
Owner

Imported GitHub comment.

Original author: @marvin-hansen
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3901243627
Original created: 2026-02-14T06:29:21Z


Maybe combine the iOS app with SR for synergy, e.g. scanning via iPhone in case access points can't be queried?
Or adding additional information, e.g. signal strength relative to position, to map out blind spots.

Author
Owner

Imported GitHub comment.

Original author: @mfreeman451
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3902772430
Original created: 2026-02-14T23:21:00Z


https://deck.gl/examples/hexagon-layer could be an interesting visualization to show wifi signal strengths on a 2D map, with the 3D hexagon layer overlay

Author
Owner

Imported GitHub comment.

Original author: @mfreeman451
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3902778684
Original created: 2026-02-14T23:27:55Z


https://deck.gl/examples/point-cloud-layer -- mind is blown.

Author
Owner

Imported GitHub comment.

Original author: @mfreeman451
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3938256115
Original created: 2026-02-21T06:10:42Z


Embedding Vectorization (pgvector) - RF Fingerprinting

GPS is useless indoors. If you want to track where a user is in a building based solely on their Wi-Fi environment, you can treat a moment in time as a vector.
If an iPhone hears [AP1: -40 dBm, AP2: -65 dBm, AP3: -80 dBm], that is a vector [-40, -65, -80].
If we store these vectors using pgvector, we can use K-Nearest Neighbors (KNN) to calculate a user's real-time physical location by matching their current "RF Vector" against the mapped vectors in the database.
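A minimal sketch of that KNN lookup (a pure-Python stand-in for the pgvector `<->` nearest-neighbor query; padding unheard APs with a -100 dBm floor so fingerprints align is an assumption):

```python
import math

def knn_locate(fingerprint, database, k=3):
    """Estimate position by k-nearest-neighbor match of RF fingerprints.

    fingerprint: {bssid: rssi_dbm} for the current scan.
    database: list of (position_tuple, {bssid: rssi_dbm}) survey entries.
    Returns the centroid of the k closest mapped positions.
    """
    # Build a shared BSSID axis so every fingerprint becomes a comparable vector.
    keys = sorted({b for _, fp in database for b in fp} | set(fingerprint))
    def vec(fp):
        return [fp.get(b, -100.0) for b in keys]  # floor for unheard APs
    q = vec(fingerprint)
    nearest = sorted(database, key=lambda item: math.dist(q, vec(item[1])))[:k]
    dims = len(nearest[0][0])
    return tuple(sum(pos[i] for pos, _ in nearest) / len(nearest)
                 for i in range(dims))
```

In production this loop is exactly what a pgvector index replaces: store each survey point's vector in a `vector` column and let `ORDER BY embedding <-> $query LIMIT k` do the neighbor search.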

Author
Owner

Imported GitHub comment.

Original author: @mfreeman451
Original URL: https://github.com/carverauto/serviceradar/issues/2835#issuecomment-3964260732
Original created: 2026-02-26T05:46:41Z


idea to integrate spatial data in topo view and transition from topo to spatial if a given cluster or node has spatial data available https://deck.gl/docs/api-reference/widgets/splitter-widget
