PRD: Project Metadata Management
📋 Implementation Issue: Issue #227 - Project Metadata Management - Phase 1 (MVP)
Overview
This PRD defines the requirements for making project metadata editable in the CodeProof application. Currently, project metadata (Project Name, Description, etc.) is read-only after project creation. This feature will enable users to update project metadata through the UI, backed by new dedicated gRPC APIs for creating and updating project metadata files.
What's New: This PRD introduces two new gRPC RPCs:
- `CreateProjectMetadata` - Creates the `project-metadata.json` file independently (for backfilling legacy projects)
- `UpdateProjectMetadata` - Updates an existing `project-metadata.json` file (for editing metadata)
Problem Statement
Current State
- Project metadata is read-only: Once a project is created, users cannot update the Project Name or Description through the UI
- Metadata creation is coupled with project creation: The `CreateArchitecturalPlanRequest` RPC creates the entire project structure (folders + metadata), not just the metadata file
- No dedicated metadata message: `GetProjectMetadataResponse` is used for both the RPC response and to represent the metadata file structure
- Limited metadata fields: Current metadata only includes basic fields (ID, name, description, creator, timestamp, status)
- Legacy projects lack project-metadata.json: Older projects created before the introduction of `project-metadata.json` only have `plan-metadata.json` files, which contain page information but not project-level metadata
Metadata File Hierarchy
The application uses multiple metadata files at different levels:
1. project-metadata.json (Project-Level) - THIS PRD
- Location: `projects/{projectId}/project-metadata.json`
- Contains: Project-level metadata (ID, name, description, creator, timestamps, status)
- Scope: Entire project
- Created by:
  - Currently: `CreateArchitecturalPlanRequest` RPC (when creating new projects)
  - NEW (this PRD): `CreateProjectMetadata` RPC (for backfilling legacy projects and standalone metadata creation)
- Format: Proto JSON with `ProjectMetadata` message
- Status: New (introduced recently); some legacy projects lack this file
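For illustration, a Phase 1 `project-metadata.json` might look like the sketch below. The exact proto JSON field names and values are assumptions based on the field list in this PRD; they are not final until the `ProjectMetadata` message lands in the TDD.

```shell
#!/bin/bash
# Hypothetical Phase 1 project-metadata.json content (field names assumed,
# not the finalized ProjectMetadata proto JSON mapping).
set -euo pipefail

METADATA_FILE="$(mktemp)"
cat > "${METADATA_FILE}" <<'EOF'
{
  "project_id": "PROJ-2024-001",
  "project_name": "Downtown Office Renovation",
  "description": "Tenant improvement, floors 3-5",
  "created_by": "owner@example.com",
  "created_at": "2024-01-15T10:30:00Z",
  "updated_by": "owner@example.com",
  "updated_at": "2024-02-01T08:00:00Z",
  "status": "ACTIVE",
  "address": { "city": "Springfield", "state": "IL" },
  "jurisdiction": "City of Springfield",
  "project_type": "COMMERCIAL"
}
EOF

# Sanity check: the core universal fields from the Phase 1 goals are present
grep -q '"project_name"' "${METADATA_FILE}" && echo "core fields present"
```

Note that the code-specific fields (IBC occupancy, construction type) are deliberately absent; they arrive in Phase 2 via `CodeMetadata`.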
2. plan-metadata.json (Page List) - LEGACY
- Location: `projects/{projectId}/plan-metadata.json`
- Contains: List of processed pages (`ArchitecturalPlan` proto with `pages` array)
- Scope: All pages in the project (flat structure)
- Created by: Page ingestion process (dynamically updated as pages are processed)
- Format: Proto JSON containing `ArchitecturalPlanPage` objects
- Status: Legacy, will eventually be superseded by file-level metadata (see Issue #167)
- Future: Will be deprecated once Issue #167 is implemented
3. files/{file_id}/metadata.json (File-Level) - FUTURE (Issue #167)
- Location: `projects/{projectId}/files/{file_id}/metadata.json`
- Contains: Rich file metadata (file info, document type, processing status, page list for that file)
- Scope: Single input file and its extracted pages
- Created by: `GenerateInputFileMetadata` RPC (to be implemented in Issue #167)
- Format: Proto JSON with `InputFileMetadata` message
- Status: Planned, not yet implemented
- Future: Will replace `plan-metadata.json` for page tracking
Current Fallback Logic:
- `listArchitecturalPlanIds()` considers a project valid if it has either `plan-metadata.json` or `project-metadata.json`
- `getArchitecturalPlan()` tries to load `plan-metadata.json` first, then checks for `project-metadata.json` as a fallback indicator that the project exists
- Legacy projects with only `plan-metadata.json` are still functional but lack project-level metadata
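The fallback rules above can be sketched as plain file-existence checks. This is a local-filesystem illustration under assumed paths and function names; the real logic runs in the backend against cloud storage.

```shell
#!/bin/bash
# Sketch of the dual-file fallback logic, modeled on local directories.
# Directory layout and function names are illustrative assumptions.
set -euo pipefail

ROOT="$(mktemp -d)"

# Legacy project: plan-metadata.json only
mkdir -p "${ROOT}/LEGACY-1" && touch "${ROOT}/LEGACY-1/plan-metadata.json"
# New-style project: project-metadata.json only
mkdir -p "${ROOT}/NEW-1" && touch "${ROOT}/NEW-1/project-metadata.json"

# Valid if EITHER metadata file exists (listArchitecturalPlanIds semantics)
is_valid_project() {
  [ -f "${ROOT}/$1/plan-metadata.json" ] || [ -f "${ROOT}/$1/project-metadata.json" ]
}

# Legacy if plan-metadata.json exists but project-metadata.json does not
is_legacy_project() {
  [ -f "${ROOT}/$1/plan-metadata.json" ] && [ ! -f "${ROOT}/$1/project-metadata.json" ]
}

LEGACY1_VALID=$(is_valid_project LEGACY-1 && echo yes || echo no)
LEGACY1_LEGACY=$(is_legacy_project LEGACY-1 && echo yes || echo no)
NEW1_LEGACY=$(is_legacy_project NEW-1 && echo yes || echo no)

echo "LEGACY-1 valid=${LEGACY1_VALID} legacy=${LEGACY1_LEGACY}"
echo "NEW-1 legacy=${NEW1_LEGACY}"

rm -rf "${ROOT}"
```

The "legacy" predicate is exactly what drives the Story 3 upgrade banner: valid project, but no `project-metadata.json`.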
Future Compatibility (Issue #167):
- `project-metadata.json` (project-level) and `files/{file_id}/metadata.json` (file-level) serve different purposes and will coexist
- `plan-metadata.json` will be deprecated once file-level metadata is implemented
- Migration path: `plan-metadata.json` → `files/{file_id}/metadata.json` (per-file page lists); `project-metadata.json` remains unchanged and continues to store project-level information
Visual Hierarchy:
Current State:
projects/{projectId}/
├── project-metadata.json ← THIS PRD (project-level: name, description, etc.)
├── plan-metadata.json ← LEGACY (flat list of all pages)
├── pages/ ← Flat structure
│ ├── 001/
│ ├── 002/
│ └── ...
└── inputs/
Future State (after Issue #167):
projects/{projectId}/
├── project-metadata.json ← THIS PRD (unchanged, project-level metadata)
├── plan-metadata.json ← DEPRECATED (replaced by file-level metadata)
├── files/ ← NEW (Issue #167)
│ ├── {file_id_1}/
│ │ ├── metadata.json ← NEW (file-level: pages from this file)
│ │ └── pages/
│ │ ├── 001/
│ │ └── 002/
│ └── {file_id_2}/
│ ├── metadata.json ← NEW (file-level: pages from this file)
│ └── pages/
│ ├── 003/
│ └── 004/
└── inputs/
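The transition between the two trees can be pictured as regrouping the flat `pages/` directories under per-file folders. This is a hypothetical local sketch only: the page-to-file mapping and the move mechanism are assumptions, since Issue #167 will define the actual migration.

```shell
#!/bin/bash
# Illustrative only: regroup flat pages/ into files/{file_id}/pages/.
# The PAGE_TO_FILE mapping is a stand-in for information that would come
# from plan-metadata.json in a real migration (Issue #167).
set -euo pipefail

PROJ="$(mktemp -d)"
mkdir -p "${PROJ}/pages/001" "${PROJ}/pages/002" "${PROJ}/pages/003"

# Assumed mapping: which input file each page was extracted from
declare -A PAGE_TO_FILE=( [001]=file-a [002]=file-a [003]=file-b )

for page in 001 002 003; do
  file_id="${PAGE_TO_FILE[${page}]}"
  mkdir -p "${PROJ}/files/${file_id}/pages"
  mv "${PROJ}/pages/${page}" "${PROJ}/files/${file_id}/pages/"
done

# Show the resulting future-state layout
find "${PROJ}/files" -type d | sort
```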
Key Insight:
- This PRD = Project-level metadata (name, description, address, etc.)
- Issue #167 = File-level metadata (which pages came from which file, document type, etc.)
- These are orthogonal concerns and both are needed
User Impact
- Users cannot correct typos in project names or descriptions
- Users cannot add more context to projects after creation
- Users cannot update project metadata as requirements evolve
- No way to add structured metadata like address, occupancy type, jurisdiction, etc.
- Legacy projects appear in listings but have no editable project metadata
Implementation Phases
Phase 1: Core Metadata Editing (MVP)
Scope: Make basic project metadata editable
Goals:
- Enable users to edit Project Name and Description through the UI
- Create dedicated gRPC APIs for metadata management (separate from full project creation)
- Define `ProjectMetadata` proto message with core universal fields only:
  - Basic info: project_id, project_name, description, created_by, created_at, status, updated_by, updated_at
  - Universal details: address, primary_building_code, jurisdiction, project_type
- Implement lightweight, synchronous gRPC operations (not Cloud Run Jobs)
- Enable backfilling/upgrading legacy projects: Create a framework to generate `project-metadata.json` for legacy projects
- Two workflows: Individual user upgrade (UI banner) + admin bulk upgrade (CLI)
What's NOT in Phase 1:
- ❌ Code-specific metadata (IBC occupancy, construction type, etc.)
- ❌ Pluggable code metadata architecture
- ❌ IBC-specific enums and submessages
Deliverables:
- ✅ `ProjectMetadata` proto with core fields
- ✅ `CreateProjectMetadata` and `UpdateProjectMetadata` RPCs
- ✅ Backend service (`ProjectMetadataService`)
- ✅ Frontend inline editing UI
- ✅ Legacy project detection and upgrade banner
- ✅ CLI tool for bulk upgrades
Phase 2: Pluggable Code Metadata Architecture
Scope: Add extensible code-specific metadata support
Goals:
- Implement `CodeMetadata` wrapper with `oneof` for type safety
- Support multiple building codes per project
- Add IBC-specific metadata structure (organized by chapter)
- Define enums for IBC occupancy groups and construction types
- Update UI to show/hide code-specific forms
Deliverables:
- ✅ `CodeMetadata` message with `oneof`
- ✅ `IbcCodeMetadata` with chapter-based organization
- ✅ `IbcOccupancyGroup` and `IbcConstructionTypeEnum` enums
- ✅ IBC submessages (occupancy, construction, height/area, fire protection, fire separation)
- ✅ Frontend code-specific forms
- ✅ Backend merge logic for multiple codes
Phase 3: Extended Code Support
Scope: Add support for additional building codes
Goals:
- Implement IRC (International Residential Code) metadata
- Implement NFPA (National Fire Protection Association) metadata
- Implement state-specific codes (California CBC, Florida FBC, etc.)
- Add validation rules per code type
- Code-specific compliance checking
Deliverables:
- ✅ `IrcCodeMetadata` message
- ✅ `NfpaCodeMetadata` message
- ✅ `CaliforniaBuildingCodeMetadata` message
- ✅ `FloridaBuildingCodeMetadata` message
- ✅ Code-specific validation logic
- ✅ Multi-code compliance reports
Phase Comparison
| Feature | Phase 1 (MVP) | Phase 2 | Phase 3 |
|---|---|---|---|
| Edit project name/description | ✅ Yes | ✅ Yes | ✅ Yes |
| Edit address | ✅ Yes | ✅ Yes | ✅ Yes |
| Edit jurisdiction | ✅ Yes | ✅ Yes | ✅ Yes |
| Edit project type | ✅ Yes | ✅ Yes | ✅ Yes |
| Legacy project upgrade | ✅ Yes | ✅ Yes | ✅ Yes |
| Admin bulk upgrade CLI | ✅ Yes | ✅ Yes | ✅ Yes |
| IBC occupancy groups | ❌ No | ✅ Yes | ✅ Yes |
| IBC construction types | ❌ No | ✅ Yes | ✅ Yes |
| IBC height/area metadata | ❌ No | ✅ Yes | ✅ Yes |
| IBC fire protection | ❌ No | ✅ Yes | ✅ Yes |
| Multiple codes support | ❌ No | ✅ Yes | ✅ Yes |
| IRC metadata | ❌ No | ❌ No | ✅ Yes |
| NFPA metadata | ❌ No | ❌ No | ✅ Yes |
| State code amendments | ❌ No | ❌ No | ✅ Yes |
Migration Path: Each phase adds new fields without breaking previous phases. Phase 1 projects continue to work in Phase 2/3 without modification.
Non-Goals
- Editing Project ID (immutable after creation)
- Editing Created By or Created At timestamps (audit trail)
- Bulk metadata updates across multiple projects
- Metadata versioning or change history (future consideration)
- Archiving or deleting projects (separate feature)
User Stories
Story 1: Edit Project Name
As a project owner
I want to edit the project name
So that I can correct typos or update the name to better reflect the project
Acceptance Criteria:
- Edit button appears next to Project Name in Settings view
- Clicking edit shows an inline text field
- Changes are saved via gRPC call
- Success/error feedback is shown
- Updated name appears immediately in UI and top app bar
Story 2: Edit Project Description
As a project owner
I want to edit the project description
So that I can add more context or update project details
Acceptance Criteria:
- Edit button appears next to Description in Settings view
- Clicking edit shows a multi-line text area
- Changes are saved via gRPC call
- Success/error feedback is shown
- Updated description appears immediately in UI
Story 3: Upgrade Legacy Project (MVP)
As a project owner
I want to be prompted to set up metadata when I open a legacy project
So that I can provide proper project information and enable full editing capabilities
Acceptance Criteria:
- When opening a legacy project (has `plan-metadata.json` but no `project-metadata.json`), the user sees an informational banner
- Banner message: "This project needs metadata setup. Please provide project details to enable full editing."
- Banner has "Set Up Metadata" button
- Clicking button opens a dialog with fields for Project Name and Description
- Project Name defaults to the project ID (editable)
- Description is empty (optional)
- After submitting, the `CreateProjectMetadata` RPC is called
- On success, the banner disappears and metadata becomes editable
- User can dismiss banner temporarily, but it reappears on next visit until metadata is created
- Upgrade process does not disrupt existing project functionality (pages, files, etc. remain accessible)
Story 4: Bulk Upgrade Legacy Projects (Admin)
As a system administrator
I want to upgrade multiple legacy projects in bulk
So that I can ensure all projects in the system have proper metadata without waiting for individual users
Acceptance Criteria:
- Admin can identify legacy projects via CLI command or admin UI
- Admin can run bulk upgrade command with configurable defaults
- Command supports dry-run mode to preview changes
- Each project gets metadata with sensible defaults (project ID as name, etc.)
- Bulk operation is idempotent (safe to run multiple times)
- Operation logs success/failure for each project
- Existing project functionality is not disrupted
- Users can still edit metadata after bulk upgrade
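The dry-run and idempotency requirements above can be sketched as follows. The `backfill_all` function and its flag are illustrative stand-ins, not the actual `BackfillProjectMetadataCommand` interface, and it operates on local directories instead of the real project store.

```shell
#!/bin/bash
# Sketch: bulk backfill with dry-run support and idempotent behavior.
# backfill_all and the directory layout are illustrative assumptions.
set -euo pipefail

ROOT="$(mktemp -d)"
for P in P1 P2; do
  mkdir -p "${ROOT}/${P}"
  touch "${ROOT}/${P}/plan-metadata.json"   # legacy marker
done

backfill_all() {
  local dry_run=$1
  local processed=0
  for dir in "${ROOT}"/*/; do
    local id; id="$(basename "${dir}")"
    # Skip projects that already have metadata -> safe to re-run (idempotent)
    if [ -f "${dir}/project-metadata.json" ]; then
      echo "skip ${id} (already upgraded)"
      continue
    fi
    if [ "${dry_run}" = "true" ]; then
      echo "would upgrade ${id}"
    else
      # Sensible default: project name = project ID
      printf '{"project_id":"%s","project_name":"%s"}\n' "${id}" "${id}" \
        > "${dir}/project-metadata.json"
      echo "upgraded ${id}"
    fi
    processed=$((processed + 1))
  done
  echo "${processed} project(s) processed"
}

backfill_all true    # dry run: reports what would change, writes nothing
backfill_all false   # real run: writes defaults
backfill_all false   # second run: everything skipped (idempotent)
```

Running the real pass twice produces no further writes, which is the idempotency property Story 4 requires, and the dry-run pass gives the admin a preview before any mutation.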
Story 5: Add Extended Metadata (Future)
As a project owner
I want to add structured metadata like address and occupancy type
So that I can better organize and categorize my projects
Acceptance Criteria:
- Additional metadata fields appear in Settings view
- Fields are optional and can be left blank
- Changes are saved to the same `project-metadata.json` file
- Metadata is preserved during project copy operations
Technical Design
📄 See: Technical Design Document
The detailed technical design, including proto definitions, backend implementation, frontend components, and CLI tools, has been moved to a separate Technical Design Document (TDD) for better organization.
Key Technical Components:
- Extensible Proto Schema:
  - `ProjectMetadata` - Core universal fields + pluggable code metadata
  - `CodeMetadata` - Wrapper for multiple building codes (type-safe via `oneof`)
  - `IbcCodeMetadata` - IBC-specific metadata organized by chapter
  - `IrcCodeMetadata`, `NfpaCodeMetadata`, etc. - Future code support
  - `ProjectAddress` - Structured address information
- gRPC RPCs: `CreateProjectMetadata`, `UpdateProjectMetadata`
- Backend Service: `ProjectMetadataService` with code-aware merging logic
- Frontend Components: Angular components with inline editing and code-specific forms
- CLI Tool: `BackfillProjectMetadataCommand` for bulk legacy project upgrades
- Migration Strategy: Backward compatible, gradual adoption
Design Philosophy:
- ✅ Universal Core: Basic fields work for any code/jurisdiction
- ✅ Type-Safe Plugins: Multiple codes via `repeated CodeMetadata` with `oneof`
- ✅ IBC-Aligned: First-class support for IBC with chapter-based organization
- ✅ Extensible: Easy to add IRC, NFPA, state codes without breaking changes
- ✅ Gradual Adoption: Start simple, add code metadata incrementally
For complete implementation details, refer to the TDD.
Success Metrics
User Engagement
- % of users who edit project metadata after creation
- Average time between project creation and first metadata edit
- Number of metadata fields updated per project
System Performance
- RPC latency for UpdateProjectMetadata (target: < 500ms)
- Success rate of metadata updates (target: > 99%)
- No increase in project load times
User Satisfaction
- Reduction in support tickets about incorrect project names
- User feedback on metadata editing experience
Risks and Mitigations
| Risk | Impact | Mitigation |
|---|---|---|
| Breaking existing `project-metadata.json` files | High | Ensure backward compatibility, test with existing projects |
| Breaking legacy projects with only `plan-metadata.json` | High | Maintain dual-file fallback logic, test with legacy projects |
| Performance degradation on metadata reads | Medium | Use efficient JSON parsing, cache metadata in memory |
| Concurrent update conflicts | Low | Document last-write-wins behavior, consider optimistic locking in future |
| Permission bypass vulnerabilities | High | Implement RBAC checks in backend, never trust frontend |
| Backfilling creates incorrect metadata | Medium | Prompt users for manual input, validate defaults, allow editing after backfill |
| Users confused by legacy project banner | Low | Clear messaging, easy-to-use backfill dialog, optional dismissal |
Future Enhancements
- Admin UI for bulk upgrades: Web-based admin console for bulk operations (currently CLI-only)
- Metadata versioning: Track history of metadata changes with rollback capability
- Bulk metadata updates: Update multiple projects at once via UI (not just upgrade, but edit)
- Metadata templates: Pre-fill common project types (residential, commercial, etc.)
- Metadata validation: Enforce required fields, validate formats (e.g., zip code format)
- Metadata search: Search projects by metadata fields in project list
- Custom metadata fields: Allow users to define custom fields per organization
- Metadata export: Export metadata to CSV/Excel for reporting
- Automatic upgrade prompt: Show banner on project list page for legacy projects (proactive)
- Scheduled bulk upgrades: Cron job to automatically upgrade legacy projects during off-hours
Integration Testing
This section defines deterministic integration tests as bash scripts using grpcurl to validate the success criteria for each user story. These tests serve as:
- Executable specifications - Clear contract before implementation
- Acceptance criteria validation - Concrete examples of expected behavior
- Translation targets - Can be converted to Java (backend) and Cypress (frontend) tests during implementation
Integration Test Execution Flow
┌─────────────────────────────────────────────────────────────┐
│ Human Admin (One-time Setup) │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ 1. Copy LEGACY-BASELINE-PROJECT from dev to test │ │
│ │ (using copy-project-between-envs.sh) │ │
│ │ │ │
│ │ 2. Verify baseline is legacy │ │
│ │ (has plan-metadata.json only) │ │
│ └──────────────────────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────┐
│ AI Agent (Automated Integration Tests) │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Test Run 1: │ │
│ │ 1. Clone baseline → TEST-CLONE-1 │ │
│ │ 2. Run upgrade tests on TEST-CLONE-1 │ │
│ │ 3. Delete TEST-CLONE-1 │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Test Run 2 (parallel): │ │
│ │ 1. Clone baseline → TEST-CLONE-2 │ │
│ │ 2. Run upgrade tests on TEST-CLONE-2 │ │
│ │ 3. Delete TEST-CLONE-2 │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ Baseline remains unchanged ✅ │
└─────────────────────────────────────────────────────────────┘
Key Benefits:
- ✅ AI agent has no dev access - Human admin handles cross-environment copy
- ✅ Disposable test clones - Each test run creates/destroys its own clone
- ✅ Baseline preserved - Original legacy project remains unchanged
- ✅ Parallel test execution - Multiple tests can run simultaneously with different clones
- ✅ Clean test environment - No leftover test data after completion
Test Environment Setup
#!/bin/bash
# test-setup.sh - Common setup for all integration tests
# Environment configuration
export GRPC_HOST="${GRPC_HOST:-localhost:8080}"
export TEST_USER_ID="${TEST_USER_ID:-test-user@example.com}"
export TEST_PROJECT_ID="${TEST_PROJECT_ID:-TEST-$(date +%s)}"
# Proto paths
export PROTO_PATH="src/main/proto"
export GOOGLEAPIS_PATH="env/dependencies/googleapis"
# Helper function: Make gRPC call
grpc_call() {
local service=$1
local method=$2
local data=$3
grpcurl -plaintext \
-import-path "${PROTO_PATH}" \
-import-path "${GOOGLEAPIS_PATH}" \
-proto "${PROTO_PATH}/api.proto" \
-d "${data}" \
"${GRPC_HOST}" \
"org.codetricks.construction.code.assistant.service.${service}/${method}"
}
# Helper function: Assert field value in JSON response
assert_field_equals() {
local json=$1
local field=$2
local expected=$3
local actual=$(echo "${json}" | jq -r ".${field}")
if [ "${actual}" != "${expected}" ]; then
echo "❌ FAILED: Expected ${field}='${expected}', got '${actual}'"
return 1
fi
echo "✅ PASSED: ${field}='${expected}'"
return 0
}
# Helper function: Assert field exists
assert_field_exists() {
local json=$1
local field=$2
local value=$(echo "${json}" | jq -r ".${field}")
if [ "${value}" == "null" ] || [ -z "${value}" ]; then
echo "❌ FAILED: Field ${field} does not exist or is null"
return 1
fi
echo "✅ PASSED: Field ${field} exists with value '${value}'"
return 0
}
Test 1: Create and Edit Project Metadata (Stories 1 & 2)
File: tests/integration/test-project-metadata-crud.sh
#!/bin/bash
# Test: Create project metadata, then edit name and description
# Covers: Story 1 (Edit Project Name) and Story 2 (Edit Project Description)
source tests/integration/test-setup.sh
set -e # Exit on any error
echo "=========================================="
echo "Test: Project Metadata CRUD Operations"
echo "=========================================="
echo "Project ID: ${TEST_PROJECT_ID}"
echo "User ID: ${TEST_USER_ID}"
echo ""
# Step 1: Create a new architectural plan with initial metadata
echo "Step 1: Creating architectural plan with initial metadata..."
CREATE_RESPONSE=$(grpc_call "ArchitecturalPlanService" "CreateArchitecturalPlan" '{
"account_id": "'"${TEST_USER_ID}"'",
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'",
"project_name": "Initial Project Name",
"project_description": "Initial project description"
}')
echo "${CREATE_RESPONSE}" | jq .
assert_field_exists "${CREATE_RESPONSE}" "architectural_plan_id"
echo ""
# Step 2: Get project metadata to verify initial state
echo "Step 2: Getting project metadata to verify initial state..."
GET_RESPONSE=$(grpc_call "ProjectMetadataService" "GetProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'"
}')
echo "${GET_RESPONSE}" | jq .
assert_field_equals "${GET_RESPONSE}" "metadata.project_name" "Initial Project Name"
assert_field_equals "${GET_RESPONSE}" "metadata.project_description" "Initial project description"
echo ""
# Step 3: Update project name (Story 1)
echo "Step 3: Updating project name..."
UPDATE_NAME_RESPONSE=$(grpc_call "ProjectMetadataService" "UpdateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'",
"project_name": "Updated Project Name"
}')
echo "${UPDATE_NAME_RESPONSE}" | jq .
assert_field_equals "${UPDATE_NAME_RESPONSE}" "metadata.project_name" "Updated Project Name"
# Description should remain unchanged (PATCH semantics)
assert_field_equals "${UPDATE_NAME_RESPONSE}" "metadata.project_description" "Initial project description"
echo ""
# Step 4: Update project description (Story 2)
echo "Step 4: Updating project description..."
UPDATE_DESC_RESPONSE=$(grpc_call "ProjectMetadataService" "UpdateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'",
"project_description": "Updated project description with more details"
}')
echo "${UPDATE_DESC_RESPONSE}" | jq .
assert_field_equals "${UPDATE_DESC_RESPONSE}" "metadata.project_name" "Updated Project Name"
assert_field_equals "${UPDATE_DESC_RESPONSE}" "metadata.project_description" "Updated project description with more details"
echo ""
# Step 5: Get metadata again to verify persistence
echo "Step 5: Verifying changes persisted..."
FINAL_GET_RESPONSE=$(grpc_call "ProjectMetadataService" "GetProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'"
}')
echo "${FINAL_GET_RESPONSE}" | jq .
assert_field_equals "${FINAL_GET_RESPONSE}" "metadata.project_name" "Updated Project Name"
assert_field_equals "${FINAL_GET_RESPONSE}" "metadata.project_description" "Updated project description with more details"
echo ""
echo "=========================================="
echo "✅ All tests passed!"
echo "=========================================="
Expected Outcomes:
- ✅ Project created with initial metadata
- ✅ Initial metadata retrieved successfully
- ✅ Project name updated, description unchanged (PATCH semantics)
- ✅ Project description updated, name unchanged
- ✅ Changes persisted across API calls
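The PATCH semantics asserted above can be restated as a small merge rule: fields omitted from an `UpdateProjectMetadata` request keep their current values. The sketch below models this with shell variables; the real service would merge proto messages server-side, and `apply_update` is an illustrative name, not an actual API.

```shell
#!/bin/bash
# Sketch of PATCH-style merge: an empty argument means "field not provided",
# so the current value is kept. Function name is illustrative.
set -euo pipefail

CURRENT_NAME="Initial Project Name"
CURRENT_DESC="Initial project description"

apply_update() {
  local new_name=$1 new_desc=$2
  [ -n "${new_name}" ] && CURRENT_NAME="${new_name}"
  [ -n "${new_desc}" ] && CURRENT_DESC="${new_desc}"
  echo "name='${CURRENT_NAME}' desc='${CURRENT_DESC}'"
}

apply_update "Updated Project Name" ""   # Step 3: name only, desc kept
apply_update "" "Updated project description with more details"  # Step 4
```

After both calls the name from step 3 and the description from step 4 coexist, which is exactly what the final `GetProjectMetadata` check in Test 1 verifies.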
Test 2: Upgrade Legacy Project (Story 3)
File: tests/integration/test-legacy-project-upgrade.sh
Prerequisites:
- Manual Setup (by human admin): Copy a legacy project from the `dev` to the `test` environment:
  # Run this from project root (requires dev environment access)
  ./cli/sdlc/utils/copy-project-between-envs.sh LEGACY-BASELINE-PROJECT \
    --source dev \
    --target test \
    --share ai-swe-agent@codetricks.org
- Verify the baseline project has only `plan-metadata.json` (no `project-metadata.json`)
Test Workflow: The integration test will:
- Clone the baseline legacy project within the test environment (using the `CopyArchitecturalPlan` gRPC)
- Run upgrade tests on the disposable clone
- Clean up the clone after test completion
#!/bin/bash
# Test: Detect and upgrade legacy project (only has plan-metadata.json)
# Covers: Story 3 (Upgrade Legacy Project)
source tests/integration/test-setup.sh
set -e
echo "=========================================="
echo "Test: Legacy Project Upgrade"
echo "=========================================="
echo ""
# Configuration
BASELINE_LEGACY_PROJECT="${BASELINE_LEGACY_PROJECT:-LEGACY-BASELINE-PROJECT}"
TEST_CLONE_ID="TEST-LEGACY-CLONE-$(date +%s)"
echo "📋 Test Configuration:"
echo " Baseline Project: ${BASELINE_LEGACY_PROJECT}"
echo " Test Clone ID: ${TEST_CLONE_ID}"
echo " Test User: ${TEST_USER_ID}"
echo ""
# Step 1: Clone the baseline legacy project for testing
echo "Step 1: Cloning baseline legacy project for testing..."
CLONE_RESPONSE=$(grpc_call "ArchitecturalPlanService" "CopyArchitecturalPlan" '{
"source_architectural_plan_id": "'"${BASELINE_LEGACY_PROJECT}"'",
"target_architectural_plan_id": "'"${TEST_CLONE_ID}"'",
"account_id": "'"${TEST_USER_ID}"'"
}')
echo "${CLONE_RESPONSE}" | jq .
assert_field_exists "${CLONE_RESPONSE}" "architectural_plan_id"
echo "✅ Created test clone: ${TEST_CLONE_ID}"
echo ""
# Step 2: Verify clone is a legacy project (no project-metadata.json)
echo "Step 2: Verifying clone is a legacy project..."
GET_LEGACY_RESPONSE=$(grpc_call "ProjectMetadataService" "GetProjectMetadata" '{
"architectural_plan_id": "'"${TEST_CLONE_ID}"'"
}' 2>&1 || echo '{"error": "not_found"}')
echo "${GET_LEGACY_RESPONSE}" | jq . 2>/dev/null || echo "${GET_LEGACY_RESPONSE}"
# Check if response indicates legacy project (no metadata or specific error)
if echo "${GET_LEGACY_RESPONSE}" | grep -q "not_found\|NotFound\|NOT_FOUND"; then
echo "✅ PASSED: Legacy project detected (no project-metadata.json)"
else
echo "⚠️ WARNING: Expected legacy project detection"
echo " Response: ${GET_LEGACY_RESPONSE}"
fi
echo ""
# Step 3: Upgrade legacy project (backfill metadata)
echo "Step 3: Upgrading legacy project (backfilling metadata)..."
UPGRADE_RESPONSE=$(grpc_call "ProjectMetadataService" "CreateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_CLONE_ID}"'",
"project_name": "'"${TEST_CLONE_ID}"'",
"project_description": ""
}')
echo "${UPGRADE_RESPONSE}" | jq .
assert_field_equals "${UPGRADE_RESPONSE}" "metadata.project_name" "${TEST_CLONE_ID}"
assert_field_equals "${UPGRADE_RESPONSE}" "metadata.project_description" ""
echo ""
# Step 4: Verify project is now upgraded
echo "Step 4: Verifying project is now upgraded..."
GET_UPGRADED_RESPONSE=$(grpc_call "ProjectMetadataService" "GetProjectMetadata" '{
"architectural_plan_id": "'"${TEST_CLONE_ID}"'"
}')
echo "${GET_UPGRADED_RESPONSE}" | jq .
assert_field_exists "${GET_UPGRADED_RESPONSE}" "metadata.project_name"
assert_field_exists "${GET_UPGRADED_RESPONSE}" "metadata.project_description"
echo ""
# Step 5: User edits the backfilled metadata
echo "Step 5: User edits backfilled metadata..."
EDIT_RESPONSE=$(grpc_call "ProjectMetadataService" "UpdateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_CLONE_ID}"'",
"project_name": "My Upgraded Project",
"project_description": "Added description after upgrade"
}')
echo "${EDIT_RESPONSE}" | jq .
assert_field_equals "${EDIT_RESPONSE}" "metadata.project_name" "My Upgraded Project"
assert_field_equals "${EDIT_RESPONSE}" "metadata.project_description" "Added description after upgrade"
echo ""
# Step 6: Cleanup - Delete test clone
echo "Step 6: Cleaning up test clone..."
DELETE_RESPONSE=$(grpc_call "ArchitecturalPlanService" "DeleteArchitecturalPlan" '{
"architectural_plan_id": "'"${TEST_CLONE_ID}"'",
"account_id": "'"${TEST_USER_ID}"'"
}' 2>&1 || echo "Cleanup skipped")
echo "✅ Test clone cleaned up"
echo ""
echo "=========================================="
echo "✅ All tests passed!"
echo "=========================================="
Expected Outcomes:
- ✅ Baseline legacy project cloned successfully within test environment
- ✅ Clone detected as legacy (no `project-metadata.json`)
- ✅ Metadata backfilled with defaults (project name = project ID, empty description)
- ✅ Project now has `project-metadata.json`
- ✅ User can edit backfilled metadata
- ✅ Edits persist correctly
- ✅ Test clone cleaned up after test
Setup Instructions for Human Admin:
Before running integration tests, a human admin with access to both environments must:
- Identify a legacy project in the dev environment (has `plan-metadata.json` but no `project-metadata.json`)
- Copy it to the test environment as baseline:
  ./cli/sdlc/utils/copy-project-between-envs.sh LEGACY-BASELINE-PROJECT \
    --source dev \
    --target test \
    --share ai-swe-agent@codetricks.org
- Verify the baseline is legacy:
  # Should exist
  gsutil ls gs://construction-code-expert-test/inputs/architectural-plans/LEGACY-BASELINE-PROJECT/plan-metadata.json
  # Should NOT exist
  gsutil ls gs://construction-code-expert-test/inputs/architectural-plans/LEGACY-BASELINE-PROJECT/project-metadata.json
- Set the environment variable for integration tests:
  export BASELINE_LEGACY_PROJECT="LEGACY-BASELINE-PROJECT"
Why This Approach:
- ✅ AI agent has no dev access - Human admin handles cross-environment copy
- ✅ Disposable test clones - Each test run creates/destroys its own clone
- ✅ Baseline preserved - Original legacy project remains unchanged
- ✅ Parallel test execution - Multiple tests can run simultaneously with different clones
- ✅ Clean test environment - No leftover test data after completion
Test 3: Bulk Upgrade Legacy Projects (Story 4)
File: tests/integration/test-bulk-upgrade.sh
Prerequisites: Same as Test 2 - requires baseline legacy project in test environment
#!/bin/bash
# Test: Admin bulk upgrade of legacy projects via CLI
# Covers: Story 4 (Bulk Upgrade Legacy Projects - Admin)
source tests/integration/test-setup.sh
set -e
echo "=========================================="
echo "Test: Bulk Upgrade Legacy Projects (Admin)"
echo "=========================================="
echo ""
# Configuration
BASELINE_LEGACY_PROJECT="${BASELINE_LEGACY_PROJECT:-LEGACY-BASELINE-PROJECT}"
TEST_CLONE_1="TEST-BULK-CLONE-1-$(date +%s)"
TEST_CLONE_2="TEST-BULK-CLONE-2-$(date +%s)"
TEST_CLONE_3="TEST-BULK-CLONE-3-$(date +%s)"
echo "📋 Test Configuration:"
echo " Baseline Project: ${BASELINE_LEGACY_PROJECT}"
echo " Test Clones: ${TEST_CLONE_1}, ${TEST_CLONE_2}, ${TEST_CLONE_3}"
echo " Test User: ${TEST_USER_ID}"
echo ""
# Step 1: Create multiple legacy project clones for testing
echo "Step 1: Creating multiple legacy project clones..."
for CLONE_ID in "${TEST_CLONE_1}" "${TEST_CLONE_2}" "${TEST_CLONE_3}"; do
echo "Creating clone: ${CLONE_ID}..."
grpc_call "ArchitecturalPlanService" "CopyArchitecturalPlan" '{
"source_architectural_plan_id": "'"${BASELINE_LEGACY_PROJECT}"'",
"target_architectural_plan_id": "'"${CLONE_ID}"'",
"account_id": "'"${TEST_USER_ID}"'"
}' > /dev/null
echo "✅ Created ${CLONE_ID}"
done
echo ""
# Step 2: List all projects for the user (to find legacy projects)
echo "Step 2: Listing all projects to identify legacy ones..."
LIST_RESPONSE=$(grpc_call "ArchitecturalPlanService" "ListArchitecturalPlanIds" '{
"account_id": "'"${TEST_USER_ID}"'",
"filter": ""
}')
echo "${LIST_RESPONSE}" | jq .
echo ""
# Step 3: Run bulk upgrade CLI command (simulated)
echo "Step 3: Running bulk upgrade CLI command..."
echo "Command: ./cli/codeproof.sh backfill-project-metadata --user-id=${TEST_USER_ID} --dry-run=false"
echo ""
# Simulate CLI command by making individual CreateProjectMetadata calls
for CLONE_ID in "${TEST_CLONE_1}" "${TEST_CLONE_2}" "${TEST_CLONE_3}"; do
echo "Upgrading ${CLONE_ID}..."
grpc_call "ProjectMetadataService" "CreateProjectMetadata" '{
"architectural_plan_id": "'"${CLONE_ID}"'",
"project_name": "'"${CLONE_ID}"'",
"project_description": ""
}' > /dev/null
echo "✅ Upgraded ${CLONE_ID}"
done
echo ""
# Step 4: Verify all projects now have metadata
echo "Step 4: Verifying all projects now have metadata..."
for CLONE_ID in "${TEST_CLONE_1}" "${TEST_CLONE_2}" "${TEST_CLONE_3}"; do
echo "Checking ${CLONE_ID}..."
GET_RESPONSE=$(grpc_call "ProjectMetadataService" "GetProjectMetadata" '{
"architectural_plan_id": "'"${CLONE_ID}"'"
}')
assert_field_equals "${GET_RESPONSE}" "metadata.project_name" "${CLONE_ID}"
echo ""
done
# Step 5: Cleanup - Delete all test clones
echo "Step 5: Cleaning up test clones..."
for CLONE_ID in "${TEST_CLONE_1}" "${TEST_CLONE_2}" "${TEST_CLONE_3}"; do
grpc_call "ArchitecturalPlanService" "DeleteArchitecturalPlan" '{
"architectural_plan_id": "'"${CLONE_ID}"'",
"account_id": "'"${TEST_USER_ID}"'"
}' > /dev/null 2>&1 || echo "Cleanup skipped for ${CLONE_ID}"
done
echo "✅ All test clones cleaned up"
echo ""
echo "=========================================="
echo "✅ All tests passed!"
echo "=========================================="
Expected Outcomes:
- ✅ Multiple legacy project clones created from baseline
- ✅ Bulk upgrade command processes all legacy projects
- ✅ Each project gets project-metadata.json with defaults
- ✅ No data loss (original plan-metadata.json preserved)
- ✅ All projects now accessible via GetProjectMetadata
- ✅ All test clones cleaned up after test
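The bulk-upgrade CLI invoked in Step 3 is simulated above with individual RPC calls; a minimal sketch of what `cli/codeproof.sh backfill-project-metadata` could look like is below. This is a hypothetical implementation: the `--dry-run` default and RPC names come from this PRD, and `grpc_call` is assumed to be sourced from `tests/integration/test-setup.sh`.

```shell
#!/bin/bash
# Hypothetical sketch of `cli/codeproof.sh backfill-project-metadata`.
# RPC names and defaults are taken from this PRD; grpc_call is assumed to be
# sourced from tests/integration/test-setup.sh.

DRY_RUN="${DRY_RUN:-true}"

# A project needs backfill when its GetProjectMetadata attempt reported NOT_FOUND.
needs_backfill() {
  echo "$1" | grep -qiE "not_?found"
}

# Create default metadata for one legacy project (name = projectId, empty description).
backfill_one() {
  local project_id="$1"
  if [ "${DRY_RUN}" = "true" ]; then
    echo "DRY RUN: would create metadata for ${project_id}"
  else
    grpc_call "ProjectMetadataService" "CreateProjectMetadata" '{
      "architectural_plan_id": "'"${project_id}"'",
      "project_name": "'"${project_id}"'",
      "project_description": ""
    }' > /dev/null
    echo "Backfilled ${project_id}"
  fi
}
```

The dry-run default mirrors the `--dry-run=false` flag shown in the command above: the tool lists candidates without writing unless explicitly told otherwise.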
Test 4: Error Handling and Edge Cases
File: tests/integration/test-error-handling.sh
#!/bin/bash
# Test: Error handling and edge cases
# Validates proper error responses for invalid inputs
source tests/integration/test-setup.sh
echo "=========================================="
echo "Test: Error Handling and Edge Cases"
echo "=========================================="
echo ""
# Test 1: Get metadata for non-existent project
echo "Test 1: Get metadata for non-existent project..."
NONEXISTENT_RESPONSE=$(grpc_call "ProjectMetadataService" "GetProjectMetadata" '{
"architectural_plan_id": "DOES-NOT-EXIST"
}' 2>&1 || echo '{"error": "not_found"}')
if echo "${NONEXISTENT_RESPONSE}" | grep -q "not_found\|NotFound\|NOT_FOUND"; then
echo "✅ PASSED: Proper error for non-existent project"
else
echo "❌ FAILED: Expected not_found error"
fi
echo ""
# Test 2: Update metadata with empty project ID
echo "Test 2: Update metadata with empty project ID..."
EMPTY_ID_RESPONSE=$(grpc_call "ProjectMetadataService" "UpdateProjectMetadata" '{
"architectural_plan_id": "",
"project_name": "Test"
}' 2>&1 || echo '{"error": "invalid_argument"}')
if echo "${EMPTY_ID_RESPONSE}" | grep -q "invalid\|INVALID\|required"; then
echo "✅ PASSED: Proper error for empty project ID"
else
echo "❌ FAILED: Expected invalid_argument error"
fi
echo ""
# Test 3: Create metadata for project that already has metadata
echo "Test 3: Create metadata for project that already has metadata..."
DUPLICATE_RESPONSE=$(grpc_call "ProjectMetadataService" "CreateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'",
"project_name": "Test"
}' 2>&1 || echo '{"error": "already_exists"}')
if echo "${DUPLICATE_RESPONSE}" | grep -q "already\|ALREADY\|exists\|EXISTS"; then
echo "✅ PASSED: Proper error for duplicate metadata creation"
else
echo "⚠️ WARNING: Expected already_exists error (or idempotent success)"
fi
echo ""
# Test 4: Update with extremely long project name
echo "Test 4: Update with extremely long project name (>1000 chars)..."
LONG_NAME=$(printf 'A%.0s' {1..1500})
LONG_NAME_RESPONSE=$(grpc_call "ProjectMetadataService" "UpdateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'",
"project_name": "'"${LONG_NAME}"'"
}' 2>&1 || echo '{"error": "invalid_argument"}')
if echo "${LONG_NAME_RESPONSE}" | grep -q "invalid\|INVALID\|too long\|exceeds"; then
echo "✅ PASSED: Proper error for excessively long project name"
else
echo "⚠️ WARNING: Should validate project name length"
fi
echo ""
# Test 5: PATCH semantics - partial update doesn't overwrite other fields
echo "Test 5: Verifying PATCH semantics (partial update)..."
# First, set both name and description
grpc_call "ProjectMetadataService" "UpdateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'",
"project_name": "Full Name",
"project_description": "Full Description"
}' > /dev/null
# Then update only name
UPDATE_PARTIAL_RESPONSE=$(grpc_call "ProjectMetadataService" "UpdateProjectMetadata" '{
"architectural_plan_id": "'"${TEST_PROJECT_ID}"'",
"project_name": "Updated Name Only"
}')
assert_field_equals "${UPDATE_PARTIAL_RESPONSE}" "metadata.project_name" "Updated Name Only"
assert_field_equals "${UPDATE_PARTIAL_RESPONSE}" "metadata.project_description" "Full Description"
echo "✅ PASSED: PATCH semantics work correctly"
echo ""
echo "=========================================="
echo "✅ All error handling tests passed!"
echo "=========================================="
Expected Outcomes:
- ✅ Proper error for non-existent project (NOT_FOUND)
- ✅ Proper error for empty/invalid project ID (INVALID_ARGUMENT)
- ✅ Proper error for duplicate metadata creation (ALREADY_EXISTS or idempotent)
- ✅ Validation for excessively long field values
- ✅ PATCH semantics work correctly (partial updates don't overwrite)
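The validation expectations above can be sketched as a server-side check. Note the 1000-character cap is an assumption taken from Test 4, not a confirmed product limit, and the function name is illustrative:

```shell
#!/bin/bash
# Sketch of the server-side input validation implied by Tests 2 and 4.
# The 1000-character cap is an assumption from Test 4, not a confirmed limit.

validate_project_name() {
  local plan_id="$1" name="$2"
  if [ -z "${plan_id}" ]; then
    echo "INVALID_ARGUMENT: architectural_plan_id is required" >&2
    return 1
  fi
  if [ "${#name}" -gt 1000 ]; then
    echo "INVALID_ARGUMENT: project_name exceeds 1000 characters" >&2
    return 1
  fi
  return 0
}
```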
Running the Tests
# Run all integration tests
./tests/integration/run-all-tests.sh
# Run individual test
./tests/integration/test-project-metadata-crud.sh
# Run against Cloud Run deployment
GRPC_HOST=construction-code-expert-dev-856365345080.us-central1.run.app:443 \
./tests/integration/test-project-metadata-crud.sh
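All of the scripts above source `tests/integration/test-setup.sh` for `grpc_call` and `assert_field_equals`. A minimal sketch of those helpers, assuming `grpcurl` and `jq` are installed and using a placeholder proto package name, might be:

```shell
#!/bin/bash
# Hypothetical sketch of the tests/integration/test-setup.sh helpers used above.
# Assumes grpcurl and jq are installed; "codeproof.v1" is a placeholder package.

GRPC_HOST="${GRPC_HOST:-localhost:8080}"

# Invoke an RPC and print the JSON response.
grpc_call() {
  local service="$1" method="$2" payload="$3"
  grpcurl -plaintext -d "${payload}" "${GRPC_HOST}" "codeproof.v1.${service}/${method}"
}

# Compare a jq path in a JSON response against an expected value; exit on mismatch.
assert_field_equals() {
  local response="$1" field="$2" expected="$3"
  local actual
  actual=$(echo "${response}" | jq -r ".${field}")
  if [ "${actual}" = "${expected}" ]; then
    echo "✅ PASSED: ${field} == ${expected}"
  else
    echo "❌ FAILED: ${field}: expected '${expected}', got '${actual}'" >&2
    exit 1
  fi
}
```

For the Cloud Run target shown above, `-plaintext` would be dropped in favor of TLS.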
Test Coverage Summary
| User Story | Test File | Key Assertions |
|---|---|---|
| Story 1: Edit Project Name | test-project-metadata-crud.sh | Name updates correctly, description unchanged |
| Story 2: Edit Project Description | test-project-metadata-crud.sh | Description updates correctly, name unchanged |
| Story 3: Upgrade Legacy Project | test-legacy-project-upgrade.sh | Legacy detection, backfill, user edits |
| Story 4: Bulk Upgrade (Admin) | test-bulk-upgrade.sh | Multiple projects upgraded, no data loss |
| Error Handling | test-error-handling.sh | Proper errors, validation, PATCH semantics |
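The `run-all-tests.sh` driver referenced above could be a thin loop over the four suites in the coverage table. A hypothetical sketch:

```shell
#!/bin/bash
# Hypothetical sketch of tests/integration/run-all-tests.sh: runs each suite
# from the coverage table and returns nonzero if any fails.

run_suites() {
  local failed=0 t
  for t in "$@"; do
    echo "=== Running ${t} ==="
    if bash "${t}"; then
      echo "✅ ${t} passed"
    else
      echo "❌ ${t} failed"
      failed=1
    fi
  done
  return "${failed}"
}

# Usage:
# run_suites tests/integration/test-project-metadata-crud.sh \
#            tests/integration/test-legacy-project-upgrade.sh \
#            tests/integration/test-bulk-upgrade.sh \
#            tests/integration/test-error-handling.sh
```

Running all suites even when one fails gives a full picture per CI run instead of stopping at the first failure.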
Translation to Java/Cypress
Once implementation begins:
Java (Backend Integration Tests):
@Test
public void testUpdateProjectName() {
// Create project
CreateArchitecturalPlanRequest createRequest = ...;
CreateArchitecturalPlanResponse createResponse = service.createArchitecturalPlan(createRequest);
String projectId = createResponse.getArchitecturalPlanId(); // assumed accessor; adjust to the actual response field
// Update name
UpdateProjectMetadataRequest updateRequest = UpdateProjectMetadataRequest.newBuilder()
.setArchitecturalPlanId(projectId)
.setProjectName("Updated Name")
.build();
UpdateProjectMetadataResponse updateResponse = service.updateProjectMetadata(updateRequest);
// Assert
assertEquals("Updated Name", updateResponse.getMetadata().getProjectName());
}
Cypress (Frontend E2E Tests):
describe('Project Metadata Editing', () => {
it('should update project name', () => {
cy.visit(`/projects/${projectId}/settings`);
cy.get('[data-testid="project-name-edit-button"]').click();
cy.get('[data-testid="project-name-input"]').clear().type('Updated Name');
cy.get('[data-testid="project-name-save-button"]').click();
cy.get('[data-testid="project-name-display"]').should('contain', 'Updated Name');
});
});
Open Questions
- Should we validate occupancy type, construction type, etc. against standard lists?
- Should we support partial updates (PATCH semantics) or require all fields?
- Answer: Yes, use PATCH semantics - only update fields that are explicitly provided
- Should we add a "last modified" indicator in the UI?
- Answer: Yes, add updated_at and updated_by fields to track changes
- Should we log metadata changes for audit purposes?
- Backfilling: Should we automatically backfill on first access or require manual action?
- Answer: Manual action with clear UI prompt (user-initiated upgrade on project access)
- Backfilling: What default values should we use for legacy projects?
- Answer: Pre-fill project name with projectId (editable), empty description (optional)
- Backfilling: Should we provide a bulk backfill CLI tool for admins?
- Answer: Yes, as part of MVP. Two workflows: (1) Individual user upgrade via UI banner, (2) Admin bulk upgrade via CLI.
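The PATCH semantics decided above can be illustrated with a small merge sketch using jq's recursive-merge operator (`*`). Field names match this PRD; storing the metadata as a flat JSON object is an assumption about the file layout:

```shell
#!/bin/bash
# Sketch of PATCH-merge semantics using jq's recursive merge (*).
# Field names match this PRD; flat JSON storage is an assumption.

EXISTING='{"project_name":"Full Name","project_description":"Full Description"}'
REQUEST='{"project_name":"Updated Name Only"}'

# Only keys present in the request overwrite stored values; others are preserved.
MERGED=$(jq -c -n --argjson a "${EXISTING}" --argjson b "${REQUEST}" '$a * $b')
echo "${MERGED}"
```

One design caveat: with proto3 scalar fields, "unset" and "explicitly empty" are indistinguishable on the wire, so true PATCH semantics typically require `optional` fields or a `google.protobuf.FieldMask` in the update request.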
Related Documentation
- Copy Project Utility: Script for copying projects between environments
- Protocol Buffers & gRPC Best Practices: Proto-first design and testing
- Developer Playbook: Build and deployment workflows
Related Issues
- Issue #167: Project Structure Reorganization (file-level metadata)
- Relationship: Complementary - Issue #167 introduces files/{file_id}/metadata.json for file-level metadata, while this PRD covers project-metadata.json for project-level metadata
- Timeline: This PRD should be implemented first, as it's simpler and provides immediate value
- Compatibility: Both metadata files will coexist and serve different purposes
- Documentation: See File Structure Reorganization PRD
References
- Developer Playbook
- UI Overview
- gRPC Playbook
- File Structure Reorganization PRD
- File Structure Reorganization TDD
- InputFileMetadata Implementation: src/main/java/org/codetricks/construction/code/assistant/service/InputFileMetadataService.java
- Proto Definitions: src/main/proto/api.proto