Device App Test Report

The Device App Test Report API allows mobile applications and network monitoring tools to submit network performance test results to the GATE system. The endpoint collects comprehensive network metrics, including download/upload speeds, latency, jitter, and DNS response times, across network types such as WiFi, mobile, Bluetooth, and Ethernet.

Base URL: https://gate.zequenze.com/api/v1

Authentication: All endpoints require a Bearer token:

Authorization: Bearer <your-api-token>

Overview

The Device App Test Report API is designed for applications that perform network testing and need to centrally report their findings to the GATE monitoring system. This API is particularly useful for:

  • Mobile applications conducting periodic network performance tests
  • IoT devices reporting connectivity metrics
  • Network monitoring tools submitting automated test results
  • Quality assurance systems tracking network performance over time

The endpoint accepts detailed test reports including the network context (type, name), test methodology (download, upload, latency, etc.), target destinations, and measured values with appropriate units. This data can then be analyzed for network performance trends, troubleshooting, and optimization efforts.

The API uses standardized abbreviations for network types and test types to ensure consistent data collection across different reporting applications and devices.


Endpoints

POST /device_app_test_report/

Description: Submits a network performance test report to the GATE system. This endpoint allows applications to register the results of network tests they have performed, including speed tests, latency measurements, and connectivity checks. The submitted data helps build a comprehensive picture of network performance across different environments and connection types.

Use Cases:

  • Mobile app reporting WiFi speed test results in different locations
  • IoT device submitting periodic connectivity health checks
  • Network monitoring tool reporting automated performance measurements
  • Quality assurance system logging test results for compliance reporting

Full URL Example:

https://gate.zequenze.com/api/v1/device_app_test_report/

Parameters:

Parameter   Type     In     Required   Description
data        string   body   Yes        JSON string containing the complete test report data

Request Body Schema:

Field          Type      Required   Description                                                         Allowed Values
network_name   string    No         Name of the current network (e.g., "Office WiFi", "Verizon LTE")    Any string
network_type   string    Yes        Type of network connection used for the test                        wi, mo, bl, et
test_type      string    Yes        Type of performance test that was conducted                         dl, ul, de, jt, dn
destination    string    Yes        Target endpoint for the test (URL, hostname, or IP address)         Any valid URL/hostname/IP
value          number    Yes        Measured test result as a numeric value                             Any positive number
count          integer   No         Number of test iterations performed (default: 1)                    Any positive integer
unit           string    No         Unit of measurement for the test value                              Kbps, KBps, ms

Network Type Codes:

  • wi = WiFi
  • mo = Mobile (cellular)
  • bl = Bluetooth
  • et = Ethernet

Test Type Codes:

  • dl = Download speed test
  • ul = Upload speed test
  • de = Delay/latency test
  • jt = Jitter measurement
  • dn = DNS response time test
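The code tables above map naturally onto lookup dictionaries when a client assembles a report. The sketch below (field names taken from the schema above; the function name and sample values are hypothetical) rejects unknown codes before anything is sent:

```python
# Lookup tables mirroring the documented network and test type codes.
NETWORK_TYPES = {"wi": "WiFi", "mo": "Mobile (cellular)", "bl": "Bluetooth", "et": "Ethernet"}
TEST_TYPES = {"dl": "Download speed", "ul": "Upload speed", "de": "Delay/latency",
              "jt": "Jitter", "dn": "DNS response time"}

def build_report(network_type, test_type, destination, value,
                 network_name=None, count=1, unit=None):
    """Assemble a test-report payload dict, rejecting unknown codes."""
    if network_type not in NETWORK_TYPES:
        raise ValueError(f"unknown network_type: {network_type!r}")
    if test_type not in TEST_TYPES:
        raise ValueError(f"unknown test_type: {test_type!r}")
    report = {"network_type": network_type, "test_type": test_type,
              "destination": destination, "value": value, "count": count}
    # Optional fields are included only when provided.
    if network_name is not None:
        report["network_name"] = network_name
    if unit is not None:
        report["unit"] = unit
    return report

report = build_report("wi", "dl", "speedtest.net", 85.7,
                      network_name="Office WiFi 5GHz", count=3, unit="Kbps")
```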

cURL Example:

curl -X POST "https://gate.zequenze.com/api/v1/device_app_test_report/" \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "network_name": "Office WiFi 5GHz",
    "network_type": "wi",
    "test_type": "dl",
    "destination": "speedtest.net",
    "value": 85.7,
    "count": 3,
    "unit": "Kbps"
  }'
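The same submission can be sketched in Python using only the standard library. This mirrors the cURL example above (token is a placeholder; the request is constructed but not sent, so no credentials or network access are assumed):

```python
import json
import urllib.request

API_URL = "https://gate.zequenze.com/api/v1/device_app_test_report/"
TOKEN = "YOUR_API_TOKEN"  # placeholder, not a real token

payload = {
    "network_name": "Office WiFi 5GHz",
    "network_type": "wi",
    "test_type": "dl",
    "destination": "speedtest.net",
    "value": 85.7,
    "count": 3,
    "unit": "Kbps",
}

# Build the POST request with the bearer token and JSON body.
req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the report
```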

Example Response:

{
  "network_name": "Office WiFi 5GHz",
  "network_type": "wi",
  "test_type": "dl",
  "destination": "speedtest.net",
  "value": 85.7,
  "count": 3,
  "unit": "Kbps"
}

Response Codes:

Status   Description
201      Created - Test report successfully submitted and stored
400      Bad Request - Invalid data format or missing required fields
401      Unauthorized - Invalid or missing authentication token
422      Unprocessable Entity - Valid format but invalid field values
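A submitting client will typically branch on these status codes. The mapping below is a sketch of one reasonable client policy, not behavior the API prescribes (in particular, retrying on unlisted codes is an assumption):

```python
def handle_status(code):
    """Map a response status code to a client-side action (sketch)."""
    if code == 201:
        return "stored"          # report accepted and persisted
    if code == 400:
        return "fix-format"      # malformed body or missing required fields
    if code == 401:
        return "refresh-token"   # invalid or missing bearer token
    if code == 422:
        return "fix-values"      # well-formed body but invalid field values
    return "retry-later"         # assumption: treat anything else as transient
```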

Common Use Cases

Use Case 1: Mobile Speed Test App

A mobile application performs regular speed tests and reports results to track network performance across different locations and carriers. The app conducts download tests every hour and submits results, using the network_name field to capture location context.

Use Case 2: IoT Device Health Monitoring

Industrial IoT devices perform periodic connectivity checks to ensure reliable communication with cloud services. Devices test DNS response times and latency to critical endpoints every 15 minutes.

Use Case 3: Office Network Monitoring

An automated monitoring system tests internal network performance by measuring upload/download speeds to various cloud services, helping IT teams identify performance degradation before users are affected.

Use Case 4: Quality Assurance Testing

QA teams use automated tools to test application performance across different network conditions, submitting jitter and latency measurements to ensure applications meet performance standards.

Use Case 5: Network Troubleshooting

Network administrators deploy testing tools that continuously monitor connection quality to specific destinations, helping identify intermittent connectivity issues and performance bottlenecks.


Best Practices

  • Consistent Testing Intervals: Submit test reports at regular intervals to establish baseline performance metrics and identify trends over time.

  • Meaningful Network Names: Use descriptive network names that help identify the testing environment (e.g., "Corp_WiFi_Building_A" rather than generic names).

  • Appropriate Test Counts: When performing multiple test iterations, include the count parameter to provide context for the averaged or aggregated value.

  • Error Handling: Implement retry logic for failed submissions, as network conditions during testing might also affect API connectivity.

  • Data Validation: Validate test results before submission to ensure realistic values (e.g., don't submit negative latency or impossibly high speeds).

  • Destination Diversity: Test against multiple destinations to get a comprehensive view of network performance, including both local and geographically distributed endpoints.

  • Unit Consistency: Always specify units for your measurements to ensure proper data interpretation and comparison across different testing tools.
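The data-validation and unit-consistency practices above can be sketched as a pre-submission check. The allowed sets come from the request body schema earlier; the helper name and error strings are hypothetical:

```python
ALLOWED_NETWORK_TYPES = {"wi", "mo", "bl", "et"}
ALLOWED_TEST_TYPES = {"dl", "ul", "de", "jt", "dn"}
ALLOWED_UNITS = {"Kbps", "KBps", "ms"}

def validate_report(report):
    """Return a list of problems; an empty list means the report looks sane."""
    problems = []
    if report.get("network_type") not in ALLOWED_NETWORK_TYPES:
        problems.append("network_type must be one of: wi, mo, bl, et")
    if report.get("test_type") not in ALLOWED_TEST_TYPES:
        problems.append("test_type must be one of: dl, ul, de, jt, dn")
    if not report.get("destination"):
        problems.append("destination is required")
    value = report.get("value")
    if not isinstance(value, (int, float)) or value <= 0:
        problems.append("value must be a positive number")
    if "unit" in report and report["unit"] not in ALLOWED_UNITS:
        problems.append("unit must be Kbps, KBps, or ms")
    if report.get("count", 1) < 1:
        problems.append("count must be a positive integer")
    return problems
```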