Batch Upload Locations

POST /v1/locations/batch
curl --request POST \
  --url https://api.bookovia.com/v1/locations/batch \
  --header 'Content-Type: application/json' \
  --header 'X-API-Key: <api-key>' \
  --data '
{
  "trip_id": "<string>",
  "locations": [
    {
      "latitude": 123,
      "longitude": 123,
      "timestamp": "<string>",
      "speed_kmh": 123,
      "heading": 123,
      "accuracy_meters": 123,
      "altitude_meters": 123,
      "battery_level": 123,
      "network_type": "<string>"
    }
  ],
  "device_info": {
    "device_id": "<string>",
    "device_model": "<string>",
    "os_version": "<string>",
    "app_version": "<string>"
  }
}
'


Overview

The Batch Upload Locations endpoint allows you to efficiently upload multiple GPS location points for active trips in a single API call. This endpoint is optimized for high-frequency location updates and includes real-time safety event detection.
Batch uploads can process up to 1000 location points per request. For larger datasets, split the data across multiple batch requests.

Authentication

This endpoint requires authentication via API key in the X-API-Key header. Required permissions: locations:write

Request

trip_id
string
required
Unique identifier for the active trip to upload locations to
locations
array
required
Array of location data points to upload
device_info
object
Device information for this batch of locations

Request Example

curl -X POST https://api.bookovia.com/v1/locations/batch \
  -H "X-API-Key: bkv_test_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "trip_id": "trip_1234567890abcdef",
    "locations": [
      {
        "latitude": 40.7128,
        "longitude": -74.0060,
        "timestamp": "2024-04-13T10:30:00Z",
        "speed_kmh": 45.2,
        "heading": 87,
        "accuracy_meters": 5.2,
        "altitude_meters": 15.8,
        "battery_level": 85,
        "network_type": "cellular"
      },
      {
        "latitude": 40.7135,
        "longitude": -74.0055,
        "timestamp": "2024-04-13T10:30:15Z", 
        "speed_kmh": 47.1,
        "heading": 89,
        "accuracy_meters": 4.8,
        "altitude_meters": 16.2,
        "battery_level": 85,
        "network_type": "cellular"
      }
    ],
    "device_info": {
      "device_id": "device_abc123",
      "device_model": "iPhone 14",
      "os_version": "iOS 17.2",
      "app_version": "2.4.1"
    }
  }'

Response

trip_id
string
Unique identifier for the trip that received the location updates
processed_count
integer
Number of location points successfully processed
rejected_count
integer
Number of location points that were rejected due to validation errors
events_detected
array
Array of safety events detected from the uploaded location data
trip_analytics
object
Updated real-time trip analytics
processing_time_ms
integer
Server processing time for this batch, in milliseconds

Success Response

{
  "trip_id": "trip_1234567890abcdef",
  "processed_count": 2,
  "rejected_count": 0,
  "events_detected": [
    {
      "event_id": "evt_harsh_accel_001",
      "event_type": "harsh_acceleration",
      "timestamp": "2024-04-13T10:30:08Z",
      "severity": "moderate",
      "location": {
        "latitude": 40.7132,
        "longitude": -74.0058
      },
      "metrics": {
        "acceleration_g": 0.38,
        "speed_change_kmh": 25.4,
        "duration_seconds": 3.2
      }
    }
  ],
  "trip_analytics": {
    "current_distance_km": 15.7,
    "current_duration_minutes": 75,
    "current_safety_score": 89,
    "locations_count": 189,
    "last_location": {
      "latitude": 40.7135,
      "longitude": -74.0055,
      "timestamp": "2024-04-13T10:30:15Z",
      "speed_kmh": 47.1
    },
    "avg_speed_kmh": 32.1
  },
  "processing_time_ms": 45
}
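For instance, a client can scan a successful response for high-severity safety events before alerting operators. The `highSeverityEvents` helper below is a hypothetical sketch, not part of the SDK; the field names follow the success-response example above:

```javascript
// Extract high-severity safety events from a batch-upload response body.
// Returns an empty array when no events were detected.
const highSeverityEvents = (response) =>
  (response.events_detected || []).filter((event) => event.severity === 'high');
```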

Error Responses

{
  "error": {
    "code": "validation_failed",
    "message": "Location data validation failed",
    "details": {
      "locations[0].latitude": "Must be between -90 and 90",
      "locations[1].timestamp": "Invalid ISO 8601 format"
    }
  }
}
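When a batch is rejected, the `details` map pairs each offending field path (such as `locations[0].latitude`) with a human-readable message. A small hypothetical helper can flatten those pairs for logging; the shape assumed here matches the error example above:

```javascript
// Flatten a validation_failed error body into log-friendly strings.
// Safely returns [] when the body has no details map.
const summarizeValidationErrors = (errorBody) => {
  const details = (errorBody.error && errorBody.error.details) || {};
  return Object.entries(details).map(([field, message]) => `${field}: ${message}`);
};
```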

SDK Examples

import Bookovia from '@bookovia/javascript-sdk';

const client = new Bookovia('bkv_test_your_api_key');

// Simple batch upload
const result = await client.locations.batchUpload('trip_1234567890abcdef', [
  {
    latitude: 40.7128,
    longitude: -74.0060,
    timestamp: new Date().toISOString(),
    speed_kmh: 45.2,
    heading: 87
  },
  {
    latitude: 40.7135,
    longitude: -74.0055,
    timestamp: new Date(Date.now() + 15000).toISOString(),
    speed_kmh: 47.1,
    heading: 89
  }
]);

console.log(`Processed ${result.processed_count} locations`);

// Batch upload with device info and error handling
try {
  const advancedResult = await client.locations.batchUpload('trip_1234567890abcdef', 
    locationPoints, 
    {
      device_info: {
        device_id: 'device_abc123',
        device_model: navigator.userAgent,
        app_version: '2.4.1'
      }
    }
  );
  
  if (advancedResult.events_detected.length > 0) {
    console.warn('Safety events detected:', advancedResult.events_detected);
  }
} catch (error) {
  if (error.code === 'batch_too_large') {
    // Split into smaller batches
    const chunks = chunkArray(locationPoints, 500);
    for (const chunk of chunks) {
      await client.locations.batchUpload(tripId, chunk);
    }
  }
}
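The `chunkArray` helper used in the error-handling branch above is not provided by the SDK; a minimal sketch might look like this:

```javascript
// Split an array of location points into fixed-size chunks,
// e.g. to stay under the batch_too_large limit.
function chunkArray(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}
```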

Use Cases

Real-time Fleet Tracking

// Continuous location streaming for fleet management
const trackFleetVehicle = async (tripId, gpsReceiver) => {
  const locationBuffer = [];
  const BATCH_SIZE = 10;
  const UPLOAD_INTERVAL = 30; // seconds
  
  gpsReceiver.on('location', (location) => {
    locationBuffer.push({
      latitude: location.lat,
      longitude: location.lng,
      timestamp: new Date().toISOString(),
      speed_kmh: location.speed,
      heading: location.course,
      accuracy_meters: location.accuracy
    });
    
    // Upload when buffer reaches batch size
    if (locationBuffer.length >= BATCH_SIZE) {
      uploadBatch(tripId, locationBuffer.splice(0, BATCH_SIZE));
    }
  });
  
  // Upload remaining locations periodically
  setInterval(() => {
    if (locationBuffer.length > 0) {
      uploadBatch(tripId, locationBuffer.splice(0));
    }
  }, UPLOAD_INTERVAL * 1000);
};

const uploadBatch = async (tripId, locations) => {
  try {
    const result = await client.locations.batchUpload(tripId, locations);
    
    // Alert on safety events
    if (result.events_detected.length > 0) {
      result.events_detected.forEach(event => {
        if (event.severity === 'high') {
          sendAlertToFleetManager(tripId, event);
        }
      });
    }
  } catch (error) {
    console.error('Failed to upload batch:', error);
  }
};

Mobile App Background Sync

# Background location sync for mobile applications
import asyncio
import sqlite3
from datetime import datetime

class LocationSyncManager:
    def __init__(self, client, db_path):
        self.client = client
        self.db = sqlite3.connect(db_path)
        self.setup_database()
    
    def setup_database(self):
        self.db.execute('''
            CREATE TABLE IF NOT EXISTS pending_locations (
                id INTEGER PRIMARY KEY,
                trip_id TEXT,
                latitude REAL,
                longitude REAL,
                timestamp TEXT,
                speed_kmh REAL,
                synced BOOLEAN DEFAULT 0
            )
        ''')
    
    async def sync_pending_locations(self):
        """Sync all pending locations to the server"""
        cursor = self.db.execute(
            "SELECT trip_id, latitude, longitude, timestamp, speed_kmh, id FROM pending_locations WHERE synced = 0"
        )
        
        # Group locations and their row ids by trip_id for batch uploads
        trips_data = {}
        trip_location_ids = {}

        for row in cursor.fetchall():
            trip_id = row[0]
            location_data = {
                'latitude': row[1],
                'longitude': row[2],
                'timestamp': row[3],
                'speed_kmh': row[4]
            }

            trips_data.setdefault(trip_id, []).append(location_data)
            trip_location_ids.setdefault(trip_id, []).append(row[5])

        # Upload each trip's locations
        for trip_id, locations in trips_data.items():
            try:
                await self.client.locations.batch_upload(
                    trip_id=trip_id,
                    locations=locations
                )

                # Mark only this trip's locations as synced, so a failed
                # upload for another trip is retried on the next sync
                ids = trip_location_ids[trip_id]
                placeholders = ','.join('?' for _ in ids)
                self.db.execute(
                    f"UPDATE pending_locations SET synced = 1 WHERE id IN ({placeholders})",
                    ids
                )
                self.db.commit()

            except Exception as e:
                print(f"Failed to sync locations for trip {trip_id}: {e}")

# Usage
sync_manager = LocationSyncManager(client, 'location_cache.db')
await sync_manager.sync_pending_locations()

IoT Device Integration

// GPS tracker integration for IoT devices
package main

import (
    "context"
    "encoding/json"
    "log"
    "time"
    "github.com/bookovia/go-sdk"
)

type GPSTracker struct {
    client     *bookovia.Client
    deviceID   string
    bufferSize int
    buffer     []*bookovia.Location
}

func NewGPSTracker(client *bookovia.Client, deviceID string) *GPSTracker {
    return &GPSTracker{
        client:     client,
        deviceID:   deviceID,
        bufferSize: 50,
        buffer:     make([]*bookovia.Location, 0, 50),
    }
}

func (g *GPSTracker) ProcessGPSMessage(tripID string, gpsData []byte) error {
    var location struct {
        Lat       float64 `json:"lat"`
        Lng       float64 `json:"lng"`
        Speed     float64 `json:"speed"`
        Heading   float64 `json:"heading"`
        Timestamp int64   `json:"timestamp"`
    }
    
    if err := json.Unmarshal(gpsData, &location); err != nil {
        return err
    }
    
    // Add to buffer
    g.buffer = append(g.buffer, &bookovia.Location{
        Latitude:  location.Lat,
        Longitude: location.Lng,
        Timestamp: time.Unix(location.Timestamp, 0),
        SpeedKmh:  location.Speed,
        Heading:   location.Heading,
    })
    
    // Upload when buffer is full
    if len(g.buffer) >= g.bufferSize {
        return g.flushBuffer(tripID)
    }
    
    return nil
}

func (g *GPSTracker) flushBuffer(tripID string) error {
    if len(g.buffer) == 0 {
        return nil
    }
    
    result, err := g.client.Locations.BatchUpload(context.Background(), &bookovia.BatchUploadRequest{
        TripID:    tripID,
        Locations: g.buffer,
        DeviceInfo: &bookovia.DeviceInfo{
            DeviceID:    g.deviceID,
            DeviceModel: "GPS Tracker v2",
            OSVersion:   "embedded",
            AppVersion:  "1.0.0",
        },
    })
    
    if err != nil {
        return err
    }
    
    log.Printf("Uploaded %d locations, detected %d events", 
        result.ProcessedCount, len(result.EventsDetected))
    
    // Clear buffer
    g.buffer = g.buffer[:0]
    return nil
}

Best Practices

Optimization

  • Use batch sizes of 50-500 locations for optimal performance
  • Larger batches reduce API calls but increase memory usage
  • Consider network conditions and device capabilities
  • Upload every 30-60 seconds for real-time tracking
  • Use shorter intervals (5-15 seconds) for high-risk scenarios
  • Buffer locations during network outages and sync when reconnected
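The buffering advice above can be sketched as follows; `uploadBatch` and `isOnline` are hypothetical stand-ins for your SDK call and connectivity check, not part of the API:

```javascript
// Hold location points locally and flush them only once the network is back.
const pending = [];

const queueLocation = (point) => {
  pending.push(point);
};

// Returns the number of points uploaded (0 while offline or when empty).
const flushWhenOnline = async (tripId, uploadBatch, isOnline) => {
  if (!isOnline() || pending.length === 0) return 0;
  const batch = pending.splice(0, 500); // stay within the recommended batch size
  await uploadBatch(tripId, batch);
  return batch.length;
};
```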

Data Quality

  • Filter out locations with accuracy > 50 meters
  • Validate coordinates are within expected geographical bounds
  • Remove duplicate timestamps or locations

// Ensure locations are in chronological order (copy first so the input array is not mutated)
const sortedLocations = [...locations].sort(
  (a, b) => new Date(a.timestamp) - new Date(b.timestamp)
);
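The accuracy and duplicate-timestamp rules above can be combined into a single filtering pass. This `cleanLocations` helper is a sketch, not an SDK function:

```javascript
// Drop low-accuracy fixes (> 50 m) and points with duplicate timestamps,
// keeping the first occurrence of each timestamp.
const cleanLocations = (locations) => {
  const seen = new Set();
  return locations.filter((loc) => {
    if (loc.accuracy_meters != null && loc.accuracy_meters > 50) return false;
    if (seen.has(loc.timestamp)) return false;
    seen.add(loc.timestamp);
    return true;
  });
};
```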

Next Steps

Single Location Upload

Upload individual location points in real-time

Get Route Data

Retrieve detailed route information and analysis

Safety Analytics

Analyze safety events detected during uploads

Real-time Streaming

Set up continuous location streaming