# Best Practices

Firebase Realtime Database-specific patterns and features for migrations.
## Table of contents
- Server Timestamps
- Transactions
- Multi-Path Updates
- Priority
- Queries
- Security Rules Considerations
- Working with JSON Data
- Handling Large Datasets
- Data Validation
- Firebase Emulator
- Performance Tips
## Server Timestamps

Use Firebase server timestamps for consistent timing:
```typescript
import * as admin from 'firebase-admin';

export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  await db.ref('posts/post1').set({
    title: 'Hello World',
    createdAt: admin.database.ServerValue.TIMESTAMP
  });
};
```
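Server timestamps also work in multi-path updates, which makes them useful for backfills. A minimal sketch, assuming a hypothetical `buildCreatedAtBackfill` helper; the sentinel is injected as a parameter so the pure logic stays testable without the Firebase SDK (in a real migration you would pass `admin.database.ServerValue.TIMESTAMP`):

```typescript
// Hypothetical helper: given existing users, build a multi-path update that
// backfills a createdAt field only where it is missing. The sentinel value
// (e.g. admin.database.ServerValue.TIMESTAMP) is passed in by the caller.
function buildCreatedAtBackfill(
  users: Record<string, { createdAt?: number }>,
  sentinel: unknown
): Record<string, unknown> {
  const updates: Record<string, unknown> = {};
  for (const [key, user] of Object.entries(users)) {
    if (user.createdAt === undefined) {
      updates[`users/${key}/createdAt`] = sentinel;
    }
  }
  return updates;
}
```

Usage in a migration might look like `await db.ref().update(buildCreatedAtBackfill(snapshot.val(), admin.database.ServerValue.TIMESTAMP));`.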
## Transactions

**Important:** Firebase Realtime Database only supports single-node atomic transactions via `ref.transaction()`. Database-wide or multi-document transactions are NOT supported. See the Transactions Guide for details and safe migration patterns.

Use single-node transactions for atomic read-modify-write operations:
```typescript
import { IRunnableScript, IMigrationInfo } from '@migration-script-runner/core';
import { IFirebaseDB, FirebaseHandler } from '@migration-script-runner/firebase';

export default class IncrementCounter implements IRunnableScript<IFirebaseDB> {
  async up(
    db: IFirebaseDB,
    info: IMigrationInfo,
    handler: FirebaseHandler
  ): Promise<string> {
    const counterRef = db.database.ref(handler.cfg.buildPath('counters/posts'));

    await counterRef.transaction((current) => {
      return (current || 0) + 1;
    });

    return 'Incremented post counter';
  }
}
```
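Returning `undefined` from the update function aborts the transaction without writing. A sketch of a guarded update (the function name and the positive-only rule are illustrative) that could be passed as `counterRef.transaction(decrementIfPositive)`:

```typescript
// Update function for ref.transaction(): decrement a counter, but abort
// (by returning undefined) when the node is missing or already at zero.
function decrementIfPositive(current: number | null): number | undefined {
  if (current === null || current <= 0) {
    return undefined; // aborts the transaction; no write happens
  }
  return current - 1;
}
```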
For most migrations, use direct operations (`set()`, `update()`) without transactions. See Multi-Path Updates below for atomic multi-path operations.
## Multi-Path Updates

Update multiple paths atomically:
```typescript
export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  const updates: Record<string, any> = {};
  updates['users/user1/name'] = 'John Doe';
  updates['usernames/johndoe'] = 'user1';
  updates['meta/lastUpdate'] = admin.database.ServerValue.TIMESTAMP;

  await db.ref().update(updates);
};
```
## Priority

Set priorities for ordering. Note that priority is a legacy ordering mechanism; for new data, prefer ordering by a named child key with `orderByChild()`:
```typescript
export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  await db.ref('items/item1').setWithPriority({
    name: 'Item 1'
  }, 1);

  await db.ref('items/item2').setWithPriority({
    name: 'Item 2'
  }, 2);
};
```
## Queries

Work with ordered data:
```typescript
export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  // Get last 10 items by timestamp
  const snapshot = await db.ref('items')
    .orderByChild('timestamp')
    .limitToLast(10)
    .once('value');

  // Process results
  const items: any[] = [];
  snapshot.forEach((child) => {
    items.push({ id: child.key, ...child.val() });
  });

  // Migrate items...
};
```
## Security Rules Considerations

Migrations run with admin privileges, but remember to update security rules for client access.
```typescript
// Migration creates new structure
export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  await db.ref('publicData').set({
    config: { theme: 'light' }
  });
};
```

Don't forget to update `rules.json`:

```json
{
  "rules": {
    "publicData": {
      ".read": true,
      ".write": false
    }
  }
}
```
## Working with JSON Data

Import/export JSON structures:
```typescript
export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  const data = {
    users: {
      user1: { name: 'John' },
      user2: { name: 'Jane' }
    },
    posts: {
      post1: { title: 'First Post', author: 'user1' }
    }
  };

  await db.ref().set(data);
};
```
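Because a root-level `set()` replaces everything beneath it, it can help to export the current subtree to a JSON file first. A minimal sketch, assuming a hypothetical `serializeBackup` helper:

```typescript
// Hypothetical helper: serialize a snapshot's value for a file-based backup
// before a destructive set(). Pretty-printing keeps the export diffable.
function serializeBackup(data: unknown): string {
  return JSON.stringify(data ?? null, null, 2);
}
```

For example: `writeFileSync('backup.json', serializeBackup((await db.ref('users').once('value')).val()));` with `writeFileSync` from `node:fs`.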
## Handling Large Datasets

Process large datasets in batches:
```typescript
export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  const BATCH_SIZE = 100;
  let lastKey: string | null = null;

  while (true) {
    let query = db.ref('users').orderByKey().limitToFirst(BATCH_SIZE);
    if (lastKey) {
      // startAt() is inclusive, so every batch after the first re-fetches the
      // previous batch's last key; it is skipped in the forEach below.
      query = query.startAt(lastKey);
    }

    const snapshot = await query.once('value');
    if (!snapshot.hasChildren()) {
      break;
    }

    const updates: Record<string, any> = {};
    let count = 0;

    snapshot.forEach((child) => {
      if (child.key !== lastKey) {
        updates[`users/${child.key}/processed`] = true;
        lastKey = child.key;
        count++;
      }
    });

    if (count > 0) {
      await db.ref().update(updates);
    }

    // After the first batch, one slot is taken by the duplicated boundary key,
    // so fewer than BATCH_SIZE - 1 new items means we have reached the end.
    if (count < BATCH_SIZE - 1) {
      break;
    }
  }
};
```
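The boundary-key handling in the pattern above is easy to get wrong, so it can be worth factoring into a pure helper (the name is illustrative) that can be unit-tested without a database:

```typescript
// Drop the inclusive startAt() boundary key from an ordered batch of keys.
// Only the first key of a batch can be the duplicate carried over from the
// previous batch, so at most one element is removed.
function dropBoundaryKey(keys: string[], lastKey: string | null): string[] {
  if (lastKey !== null && keys[0] === lastKey) {
    return keys.slice(1);
  }
  return keys;
}
```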
## Data Validation

Validate data structure before migration:
```typescript
export const up: IMigrationScript<admin.database.Database>['up'] = async (db) => {
  const snapshot = await db.ref('users').once('value');
  const users = snapshot.val();

  // Validate each user
  Object.entries(users || {}).forEach(([key, user]: [string, any]) => {
    if (!user.email || !user.name) {
      throw new Error(`Invalid user data for ${key}`);
    }
  });

  // Proceed with migration...
};
```
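The loop above fails on the first bad record, so each run reports only one problem. A variant (the helper name is illustrative) that collects every invalid key first gives a complete report in one pass:

```typescript
// Return the keys of all users missing a required field, instead of
// throwing on the first one.
function findInvalidUsers(
  users: Record<string, { email?: string; name?: string }> | null
): string[] {
  return Object.entries(users ?? {})
    .filter(([, user]) => !user.email || !user.name)
    .map(([key]) => key);
}
```

In a migration: `const bad = findInvalidUsers(snapshot.val()); if (bad.length) throw new Error(`Invalid user data: ${bad.join(', ')}`);`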
## Firebase Emulator

Test migrations with the Firebase emulator:

```bash
# Start emulator
firebase emulators:start --only database

# Run migrations against emulator
export FIREBASE_DATABASE_URL=http://localhost:9000
npx msr-firebase migrate
```
## Performance Tips

### 1. Use Multi-Path Updates

Instead of:

```typescript
await db.ref('users/user1/name').set('John');
await db.ref('users/user1/email').set('john@example.com');
```

Use:

```typescript
await db.ref().update({
  'users/user1/name': 'John',
  'users/user1/email': 'john@example.com'
});
```
### 2. Batch Read Operations

Read related data in one query:

```typescript
const snapshot = await db.ref('users').once('value');
// Process all users from one snapshot
```
### 3. Avoid Deep Queries

Design data structures to minimize deep queries:

```javascript
// Good: Flat structure
{
  "users": { "user1": {...} },
  "posts": { "post1": {...} }
}

// Avoid: Deeply nested
{
  "users": {
    "user1": {
      "posts": {
        "post1": {...}
      }
    }
  }
}
```
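With a flat layout, related paths stay in sync via one atomic fan-out update. A sketch, where the helper name and the `userPosts` index path are illustrative choices rather than anything this library prescribes:

```typescript
// Build a multi-path "fan-out" update that writes a post under a flat
// posts/ tree and a lightweight index entry under userPosts/<uid>/ at once.
function buildPostFanOut(
  userId: string,
  postId: string,
  post: { title: string }
): Record<string, any> {
  return {
    [`posts/${postId}`]: { ...post, author: userId },
    [`userPosts/${userId}/${postId}`]: true,
  };
}
```

Applied with a single atomic write: `await db.ref().update(buildPostFanOut('user1', 'post1', { title: 'Hello' }));`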