AI-Assisted RAGAGEP Gap Analysis: A Framework for Automated Engineering Standards Compliance

Timothy Porritt
Porritt Inc., Salt Lake City, UT
April 5, 2026


Abstract

Recognized and Generally Accepted Good Engineering Practices (RAGAGEP) compliance represents a critical operational challenge for refinery and chemical processing facilities, particularly as regulatory enforcement intensifies and the May 2027 STAA (Safer Technologies and Alternatives Analysis) compliance deadline approaches. Current manual gap analysis processes require 3–4 weeks per engineering standard at costs ranging from $8,500 to $25,000 per engagement, creating significant barriers to compliance for small and mid-sized facilities. This paper presents NORMEX Standards AI, a framework that automates RAGAGEP gap analysis through machine learning-driven clause extraction, cross-referenced standards databases, and agentic workflows. Our pilot deployment demonstrates completion of complex gap analyses (e.g., ASME B31.3 piping standard assessments) in 4.2 days versus an estimated 22 business days for traditional manual methods †, with quantified compliance coverage metrics and actionable remediation pathways. The framework integrates with autonomous facility management systems and addresses critical DOE Genesis Mission objectives around AI-driven advanced manufacturing and industrial productivity. We present the technical architecture, validation methodology, regulatory alignment, and future roadmap, including OpenPSM Benchmark standardization and DOE National Lab deployment.

Keywords: RAGAGEP, gap analysis, process safety management, compliance automation, machine learning, engineering standards, autonomous systems


1. Introduction

1.1 Regulatory and Commercial Context

Recognized and Generally Accepted Good Engineering Practices (RAGAGEP) compliance occupies a paradoxical position in industrial process safety: simultaneously critical to regulatory compliance and deeply fragmented across hundreds of engineering standards, technical publications, industry guidelines, and institutional practices. The regulation itself—embedded in OSHA’s Process Safety Management (PSM) standard (29 CFR 1910.119)—requires that equipment be “designed, maintained, inspected, tested, and operated in a safe manner” consistent with recognized standards but provides no unified mechanism for identifying, evaluating, or implementing those standards.

This regulatory ambiguity has substantial consequences. Facilities subject to OSHA PSM and EPA RMP (Risk Management Program, 40 CFR Part 68) regulations must conduct periodic gap analyses—systematic reviews comparing current practices against recognized standards—to demonstrate due diligence and achieve compliance certifications. The March 2024 EPA RMP update tightened documentation requirements for RAGAGEP compliance pathways, while the approaching May 2027 STAA deadline creates additional compliance pressure for facilities seeking DOE recognition or partnership in advanced manufacturing initiatives.

The current state of practice is labor-intensive and costly:

  • Typical manual gap analysis workflow: 3–4 weeks per engineering standard
  • Professional services cost: $8,500–$25,000 per engagement
  • Coverage limitations: Facilities typically analyze only 5–8 critical standards, leaving significant compliance exposure
  • Expertise bottleneck: Requires senior process engineers with simultaneous mastery of regulatory requirements, engineering practice, and specific facility systems

For small and mid-sized businesses (SMBs) operating refineries, chemical plants, or other regulated industrial facilities, these costs represent a significant barrier to proactive compliance. Many facilities defer gap analysis until triggered by incident investigation, regulatory inspection, or third-party audit—a reactive posture that increases liability exposure.

1.2 The NORMEX Standards AI Initiative

Porritt Inc. has developed NORMEX Standards AI to address this compliance automation challenge. The system is designed to:

  1. Eliminate temporal bottlenecks through parallel processing of multiple standards
  2. Reduce direct professional services costs by automating routine gap identification
  3. Improve coverage by making comprehensive multi-standard analysis economically feasible
  4. Enhance consistency through algorithmic gap scoring and risk ranking
  5. Enable scalability to autonomous facility management and DOE-scale infrastructure compliance

The framework leverages recent advances in large language models (LLMs), retrieval-augmented generation (RAG), and agentic AI workflows to automate the most time-intensive component of gap analysis: identifying relevant clauses across standards, extracting requirements, cross-referencing facility documentation, and scoring compliance gaps.

1.3 Roadmap and Objectives

This white paper establishes the technical foundation for:

  • Genesis Mission engagement: Demonstrating Porritt Inc.’s capability to address “AI-Driven Autonomous Laboratories” and manufacturing productivity challenges in DOE contexts
  • Phase 1 DOE facility deployment: Piloting NORMEX at a National Lab or DOE-regulated facility in FY2026
  • OpenPSM Benchmark creation: Establishing standardized datasets for evaluating RAGAGEP compliance AI systems
  • Autonomous facility management integration: Positioning NORMEX as a compliance layer within broader AutoPID autonomous systems

2. Related Work

2.1 Manual Gap Analysis Practice

The traditional gap analysis methodology, documented in industry guidance from CCPS (Center for Chemical Process Safety) and embodied in professional services practice, follows a sequential process:

  1. Standard selection: Identify relevant engineering standards based on facility scope (piping, equipment, instrumentation, etc.)
  2. Requirement extraction: Senior engineer manually reviews standard, extracting applicable requirements
  3. Facility documentation review: Cross-reference current facility design basis, operating procedures, maintenance records
  4. Gap scoring: Qualitative assessment of compliance status: full compliance, partial compliance, or non-compliance
  5. Remediation planning: Draft corrective action plans addressing identified gaps

This methodology is labor-intensive because requirement extraction and facility cross-referencing are highly specialized tasks. A single engineering standard (e.g., ASME B31.3 Process Piping or API 570 Piping Inspection Code) may contain 200–500 distinct design, material, operational, or maintenance requirements distributed across chapters covering scope, design factors, material selection, pressure testing, inspection intervals, and documentation standards.

Limitations of current approaches:

  • Scalability: Manual analysis creates linear cost-time relationships; comprehensive multi-standard analysis (10–15 standards) is economically prohibitive for most SMBs
  • Consistency: Gap scoring reflects individual engineer judgment; inter-assessor variability in requirement interpretation and compliance determination
  • Latency: 3–4 week timelines limit responsiveness to regulatory changes or incident-driven compliance reviews
  • Coverage: Economic constraints force selection of only highest-risk standards, leaving compliance exposure in secondary systems

2.2 Enterprise Compliance Platforms

Existing enterprise solutions (Intelex, Cority, Veeva Vault, SafeChain) address compliance management at an organizational level through:

  • Document management: Centralized repository for standards, procedures, audit findings
  • Workflow automation: Task assignment, approval routing, evidence tracking
  • Risk dashboards: Visual representation of compliance status and remediation progress
  • Integration: Connection to ERP/CMMS systems for asset and maintenance data

However, these platforms are predominantly process-centric rather than standards-centric. They excel at managing workflows around compliance but do not automate the extraction of technical requirements from standards or the algorithmic matching of those requirements to facility documentation. The core gap identification task—determining what a standard actually requires and whether a facility meets those requirements—remains a manual cognitive exercise.

Additionally, enterprise platforms are cost-prohibitive for SMBs (typical licensing: $50,000–$200,000+ annually) and require substantial change management and data migration investment.

2.3 AI and Automation in Compliance

Recent advances in LLM-based document analysis have enabled new approaches to regulatory and standards compliance:

  • Document classification: Using transformer models to categorize regulatory documents and identify relevant sections
  • Named entity recognition (NER): Extracting regulatory requirements, applicability criteria, and technical parameters
  • Retrieval-augmented generation (RAG): Combining dense retrieval with LLM generation to match external documents (standards, facility records) against requirement queries
  • Agentic workflows: Multi-turn AI systems that recursively refine analyses and cross-reference multiple information sources

However, published applications of these techniques have focused primarily on general regulatory compliance (environmental, data privacy, financial services) rather than the specialized domain of engineering standards and process safety. The RAGAGEP domain presents distinct challenges:

  • Technical depth: Standards contain specialized engineering terminology, design formulas, and domain-specific practices
  • Cross-standard dependencies: Many RAGAGEP standards reference and build upon other standards (e.g., ASME B31.3 references ASME B16.5 flanges, ASME Section VIII pressure vessel code)
  • Facility-specific applicability: Determining whether a standard or clause applies requires understanding facility design basis, equipment specifications, and operating envelope
  • Quantitative requirements: Many standards specify precise numerical requirements (pressures, temperatures, inspection intervals, material properties) that must be extracted and validated

2.4 Position of NORMEX Standards AI

NORMEX extends the emerging field of AI-assisted compliance to the RAGAGEP domain by:

  1. Domain specialization: Fine-tuning language models on engineering standards and process safety literature
  2. Cross-reference architecture: Building a persistent standards database with hyperlinked requirements and dependencies
  3. Facility-aware gap scoring: Integrating facility documentation (P&IDs, equipment datasheets, operating procedures) as context for compliance assessment
  4. Agentic refinement: Implementing multi-turn reasoning workflows that validate AI-extracted requirements against both standards text and facility data
  5. Quantitative metrics: Producing compliance coverage scores, risk-ranked gap lists, and actionable remediation recommendations

3. Framework Architecture

3.1 System Overview

NORMEX Standards AI implements a modular pipeline architecture with four primary processing stages:

┌─────────────────────────────────────────────────────────────┐
│                    INPUT LAYER                              │
│  Standards PDFs | Facility Docs | Operating Procedures      │
└────────────────────┬────────────────────────────────────────┘
                     │
┌────────────────────▼────────────────────────────────────────┐
│                 INGESTION STAGE                             │
│  • PDF parsing & OCR                                        │
│  • Document structure recognition                           │
│  • Section decomposition                                    │
└────────────────────┬────────────────────────────────────────┘
                     │
┌────────────────────▼────────────────────────────────────────┐
│            AI-DRIVEN EXTRACTION STAGE                       │
│  • Clause extraction (LLM-based)                            │
│  • Requirement normalization                                │
│  • Applicability classification                             │
│  • Cross-reference identification                           │
└────────────────────┬────────────────────────────────────────┘
                     │
┌────────────────────▼────────────────────────────────────────┐
│           STANDARDS DATABASE & INDEXING                     │
│  • Requirement storage & versioning                         │
│  • Semantic embedding for retrieval                         │
│  • Dependency graph construction                            │
└────────────────────┬────────────────────────────────────────┘
                     │
┌────────────────────▼────────────────────────────────────────┐
│          GAP IDENTIFICATION & SCORING                       │
│  • RAG-based facility-to-standard matching                  │
│  • Compliance status determination                          │
│  • Risk ranking algorithm                                   │
│  • Evidence collection                                      │
└────────────────────┬────────────────────────────────────────┘
                     │
┌────────────────────▼────────────────────────────────────────┐
│               OUTPUT LAYER                                  │
│  Gap Reports | Risk Matrices | Remediation Plans            │
└─────────────────────────────────────────────────────────────┘

3.2 Component Architecture

3.2.1 PDF Ingestion and Decomposition

The system ingests engineering standards in PDF format and performs:

  1. Structural parsing: Identifies document hierarchy (chapters, sections, subsections) through layout analysis and heading detection
  2. OCR processing: For scanned documents, applies optical character recognition with quality verification
  3. Section decomposition: Segments documents into logical units (clauses, requirement paragraphs, tables, figures)
  4. Metadata extraction: Captures document metadata (standard number, revision, publication date, applicability scope)

This stage produces a structured representation of each standard as a graph of semantic units, enabling subsequent processing to operate at the appropriate granularity.

3.2.2 AI-Driven Clause and Requirement Extraction

Using a fine-tuned LLM trained on engineering standards, the system performs:

  1. Requirement identification: Identifies sentences and clauses containing actionable engineering requirements (design factors, material specifications, inspection intervals, documentation standards)
  2. Requirement normalization: Converts identified requirements into structured format: {standard, section, clause, requirement_text, requirement_type, applicability_criteria, quantitative_values}
  3. Applicability classification: Determines whether requirements apply universally or conditionally (e.g., “applicable only for pressures exceeding 1500 psig”)
  4. Cross-reference extraction: Identifies references to other standards, codes, or published practices
  5. Uncertainty flagging: Marks requirements where AI confidence is below threshold for manual verification
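The normalized record produced in step 2 can be sketched as a simple data structure. The following Python dataclass is illustrative only: the field names mirror the structured format listed above, and the example clause text and identifiers are paraphrased stand-ins, not verbatim standard content.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Requirement:
    """One normalized requirement extracted from an engineering standard."""
    standard: str                    # e.g., "ASME B31.3"
    section: str                     # section identifier within the standard
    clause: str                      # clause identifier within the section
    requirement_text: str            # normalized requirement statement
    requirement_type: str            # design | material | operational | maintenance | documentation
    applicability_criteria: Optional[str] = None  # e.g., "pressure > 1500 psig"
    quantitative_values: dict = field(default_factory=dict)
    confidence: float = 1.0          # extraction confidence; <0.75 triggers manual review

req = Requirement(
    standard="ASME B31.3",
    section="345.4",
    clause="(a)",
    requirement_text="Hydrostatic test pressure shall be not less than 1.5 times design pressure.",
    requirement_type="design",
    quantitative_values={"test_pressure_factor": 1.5},
    confidence=0.82,
)
needs_review = req.confidence < 0.75
```

The confidence field carries the uncertainty-flagging threshold from step 5 directly into the record, so downstream stages can filter for manual verification without re-querying the model.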

3.2.3 Standards Database and Semantic Indexing

Extracted requirements are stored in a persistent standards database with:

  • Versioning: Multiple revisions of standards tracked separately, enabling gap analysis against specific standards editions
  • Semantic embedding: Each requirement is embedded in a vector space using domain-specialized embeddings, enabling semantic similarity search
  • Dependency graph: Cross-references between standards are represented as a knowledge graph, enabling transitive requirement identification
  • Audit trail: All extraction, classification, and modification decisions are logged with confidence scores and timestamps
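The semantic-similarity retrieval described above reduces, at its core, to nearest-neighbor search over requirement embeddings. A minimal brute-force sketch follows; the toy 3-dimensional vectors and hypothetical requirement IDs stand in for a domain-specialized encoder and a production vector store.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def top_k(query_vec, index, k=3):
    """Return ids of the k requirements whose embeddings best match the query."""
    scored = sorted(index, key=lambda item: -cosine(query_vec, item[1]))
    return [req_id for req_id, _ in scored[:k]]

# toy index: requirement id -> embedding
index = [
    ("B31.3-345.4",  [1.0, 0.0, 0.0]),
    ("B31.3-341.4",  [0.9, 0.1, 0.0]),
    ("API570-6.3",   [0.0, 1.0, 0.0]),
    ("B16.5-2.1",    [0.0, 0.0, 1.0]),
]
hits = top_k([1.0, 0.05, 0.0], index, k=2)
```

In production this brute-force scan would be replaced by an approximate nearest-neighbor index; the interface (query embedding in, ranked requirement IDs out) stays the same.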

3.2.4 Facility Documentation Integration

Facility-specific inputs include:

  • Process and Instrumentation Diagrams (P&IDs): Parsed to identify equipment, operating conditions, and system scope
  • Equipment datasheets: Specifications for pumps, compressors, heat exchangers, and other equipment
  • Operating procedures: Standard Operating Procedures (SOPs) and Operational Guides documenting current practices
  • Inspection and maintenance records: Historical maintenance logs and inspection reports
  • Design basis documentation: Process design memoranda, equipment selection rationales, material specifications

These documents are ingested, parsed, and indexed using the same infrastructure as standards, creating a unified knowledge base from which gap analysis queries can be formulated.

3.2.5 Agentic Gap Identification Workflow

The system implements a multi-turn reasoning workflow:

  1. Query formulation: For each standard requirement, the agent formulates a specific query: “Is the system designed and operated in compliance with {requirement}?”
  2. Facility data retrieval: RAG component retrieves relevant facility documentation using semantic similarity
  3. Reasoning and evidence collection: LLM synthesizes facility data against the requirement, identifying supporting evidence or gaps
  4. Confidence assessment: Quantifies confidence in compliance determination (high/medium/low)
  5. Refinement: If confidence is low, triggers additional queries or flags for manual review
  6. Gap logging: Records identified gaps with evidence, confidence level, and remediation suggestions
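The six-step workflow above can be sketched as a small control loop. The retrieve and judge callables below are stand-ins for the RAG retriever and the LLM compliance judge; the thresholds, turn limit, and return shapes are illustrative assumptions.

```python
def assess_requirement(req_text, retrieve, judge, max_turns=2, low_conf=0.6):
    """Multi-turn gap check: retrieve facility evidence, judge compliance,
    refine the query once, and escalate to manual review if confidence stays low."""
    query = f"Is the system designed and operated in compliance with: {req_text}?"
    history = []
    for _ in range(max_turns):
        evidence = retrieve(query, extra_context=history)
        status, confidence = judge(req_text, evidence)
        history.append((evidence, status, confidence))
        if confidence >= low_conf:
            return {"status": status, "confidence": confidence, "evidence": evidence}
        query = f"Find additional documentation relevant to: {req_text}"
    return {"status": "indeterminate", "confidence": confidence,
            "evidence": evidence, "manual_review": True}

# stub components for illustration only
def retrieve(query, extra_context):   # would be the RAG retriever
    return ["SOP-114 hydrotest procedure"] if "hydro" in query.lower() else []

def judge(req, evidence):             # would be the LLM compliance judge
    return ("compliant", 0.9) if evidence else ("non-compliant", 0.4)

result = assess_requirement("Hydrostatic test per applicable code section", retrieve, judge)
```

The escalation path (step 5) is the key design choice: a low-confidence determination never silently becomes a finding; it either improves on a refined retrieval or is flagged for a human.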

3.2.6 Compliance Risk Ranking

Identified gaps are ranked using a multi-factor algorithm:

Risk Score = (Likelihood × Impact × Regulatory Weight) × Applicability

Where:
  Likelihood = probability gap could cause operational incident (0–1)
  Impact = severity of potential incident (0–1)
  Regulatory Weight = enforcement priority (0.5–1.5)
  Applicability = confidence gap is truly applicable to facility (0–1)

This ranking enables facilities to prioritize remediation investment toward highest-consequence gaps.
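As a concrete sketch, the ranking formula translates directly into code; input ranges follow the definitions above, and the example scores are hypothetical.

```python
def risk_score(likelihood, impact, regulatory_weight, applicability):
    """Risk Score = (Likelihood x Impact x Regulatory Weight) x Applicability.
    likelihood, impact, applicability in [0, 1]; regulatory_weight in [0.5, 1.5]."""
    assert 0.0 <= likelihood <= 1.0 and 0.0 <= impact <= 1.0
    assert 0.5 <= regulatory_weight <= 1.5 and 0.0 <= applicability <= 1.0
    return likelihood * impact * regulatory_weight * applicability

# a clearly applicable high-consequence gap outranks the same gap
# when its applicability to the facility is doubtful
hot = risk_score(0.6, 0.9, 1.4, 1.0)   # 0.756
cold = risk_score(0.6, 0.9, 1.4, 0.3)  # 0.2268
```

Because the applicability term multiplies the whole score, uncertain applicability discounts a gap rather than excluding it, keeping doubtful-but-severe items visible in the ranked list.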

3.3 System Integration

NORMEX is designed for integration with:

  • Enterprise compliance platforms: Export of findings to Intelex, Cority, or similar systems
  • Asset management systems: Query of equipment specifications from CMMS or ERP systems
  • Autonomous systems: Serving as a compliance verification layer within AutoPID autonomous facility management systems
  • Regulatory submission: Generation of compliance documentation for regulatory agency submissions

4. Methodology

4.1 Standard Processing Workflow

The concrete workflow for analyzing a single engineering standard proceeds as follows:

Step 1: Standard Ingestion and Initial Indexing

  • Load PDF of target standard (e.g., ASME B31.3 Process Piping)
  • Run OCR with quality verification; flag pages with <95% confidence
  • Extract document structure using heading hierarchy and layout analysis
  • Create initial index with section numbers, titles, and page references

Step 2: Requirement Extraction

  • Divide document into overlapping chunks (512-token windows with 128-token overlap to preserve context across chunk boundaries)
  • For each chunk, run extraction LLM with prompt template: “Extract all engineering requirements from this section. Each requirement should specify (a) what must be done, (b) applicable conditions, (c) any quantitative parameters. Return as structured JSON.”
  • Collect extracted requirements with confidence scores
  • Deduplicate requirements identified across multiple chunks
  • Flag extracted requirements with confidence <0.75 for manual verification
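The overlapping-window chunking above can be sketched as follows, with whitespace tokens standing in for the model tokenizer.

```python
def chunk_tokens(tokens, window=512, overlap=128):
    """Split a token list into overlapping windows (stride = window - overlap)
    so a requirement spanning a boundary appears intact in at least one chunk."""
    stride = window - overlap
    chunks = []
    for start in range(0, len(tokens), stride):
        chunks.append(tokens[start:start + window])
        if start + window >= len(tokens):
            break
    return chunks

tokens = [f"t{i}" for i in range(1000)]
chunks = chunk_tokens(tokens)
# windows start at 0, 384, 768; consecutive chunks share 128 tokens
```

The shared 128-token tail is exactly why the deduplication pass in the following bullet is needed: a requirement falling in the overlap is extracted from both chunks.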

Step 3: Requirement Normalization and Relationship Building

  • Normalize requirement text to consistent format
  • Link requirements to specific standard sections and clauses
  • Identify internal cross-references (e.g., “See Section 3.1.2”)
  • Extract quantitative parameters (pressures, temperatures, inspection intervals, safety factors)
  • Classify requirement type: design, material, operational, maintenance, documentation
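The quantitative-parameter extraction above can be approximated with a value-plus-unit pattern such as the following; the unit list is illustrative, not exhaustive.

```python
import re

# value + unit pattern for quantities common in piping standards
QTY = re.compile(
    r"(?P<value>\d+(?:\.\d+)?)\s*(?P<unit>psig|psi|years?|months?|mm|in\.?)",
    re.IGNORECASE,
)

def extract_quantities(clause_text):
    """Pull (value, unit) pairs out of a requirement clause."""
    return [(float(m.group("value")), m.group("unit")) for m in QTY.finditer(clause_text)]

clause = "Hydrostatic test at not less than 1500 psig; reinspect within 5 years."
pairs = extract_quantities(clause)
```

Note the alternation order: `psig` must precede `psi` or the longer unit would never match. A production extractor would pair this lexical pass with the LLM output to cross-check extracted values.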

Step 4: Cross-Standard Reference Resolution

  • Identify references to external standards (e.g., ASME B31.3 references ASME B16.5, B16.34, Section VIII, etc.)
  • For each reference, determine which referenced requirements are applicable to the target facility
  • Create hyperlinks between standards in the database
  • Flag circular or conflicting requirements for resolution
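Flagging circular references reduces to cycle detection over the standard-to-standard reference graph. A minimal depth-first sketch follows; the back-reference from ASME B16.5 is artificial, added only to exercise the cycle path.

```python
def find_reference_cycles(refs):
    """Depth-first walk over the reference graph, returning any circular
    reference chains that need manual resolution."""
    cycles, path = [], []

    def visit(node, seen):
        path.append(node)
        for nxt in refs.get(node, []):
            if nxt in path:
                cycles.append(path[path.index(nxt):] + [nxt])
            elif nxt not in seen:
                seen.add(nxt)
                visit(nxt, seen)
        path.pop()

    for root in refs:
        visit(root, {root})
    return cycles

# toy reference graph: B31.3 cites B16.5 and Section VIII; the reverse
# edge from B16.5 is fabricated to demonstrate cycle flagging
refs = {
    "ASME B31.3": ["ASME B16.5", "ASME Sec VIII"],
    "ASME B16.5": ["ASME B31.3"],
    "ASME Sec VIII": [],
}
cycles = find_reference_cycles(refs)
```

The same graph, traversed forward without the cycle check, yields the transitive requirement set mentioned in Section 3.2.3.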

Step 5: Facility-Specific Applicability Classification

  • For each extracted requirement, determine applicability to the target facility based on:
      – Equipment scope (does the facility have equipment covered by this requirement?)
      – Operating conditions (do facility pressures/temperatures fall within requirement scope?)
      – Regulatory applicability (OSHA PSM, EPA RMP, STAA requirements)
  • Mark requirement as: universally applicable, conditionally applicable, or not applicable
  • Assign applicability confidence score
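A minimal sketch of the applicability classification in this step, assuming a hypothetical flat schema for requirement rules and facility scope (field names and confidence values are illustrative):

```python
def classify_applicability(rule, facility):
    """Return (classification, confidence) for one requirement rule
    checked against facility equipment scope and operating envelope."""
    if rule.get("equipment") and rule["equipment"] not in facility["equipment"]:
        return "not_applicable", 0.95          # equipment out of scope
    lo, hi = rule.get("pressure_psig", (0, float("inf")))
    p_min, p_max = facility["operating_pressure_psig"]
    if p_max < lo or p_min > hi:
        return "not_applicable", 0.9           # envelope entirely outside rule scope
    if p_min >= lo and p_max <= hi:
        return "universally_applicable", 0.9   # whole envelope inside rule scope
    return "conditionally_applicable", 0.7     # envelope straddles the threshold

facility = {"equipment": {"piping", "pumps"}, "operating_pressure_psig": (50, 900)}
rule = {"equipment": "piping", "pressure_psig": (0, 1500)}
verdict = classify_applicability(rule, facility)
```

Real applicability logic is far richer (material class, fluid service, jurisdictional triggers), which is why the step ends with a confidence score rather than a binary answer.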

Step 6: Gap Analysis and Compliance Scoring

  • For each applicable requirement, formulate facility compliance query
  • Retrieve relevant facility documentation using RAG
  • Run agentic reasoning loop to determine compliance status
  • Classify compliance as: compliant, partially compliant, non-compliant, or indeterminate (requires manual review)
  • Assign confidence score to compliance determination

Step 7: Risk Ranking and Report Generation

  • Calculate risk score for each gap using multi-factor algorithm
  • Sort gaps by risk score
  • Generate compliance coverage metrics (% of requirements met)
  • Produce detailed gap report with evidence trails
  • Generate remediation recommendations with estimated implementation effort
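The coverage metric and risk-sorted gap list in this step can be sketched as follows; the requirement IDs and risk scores are hypothetical.

```python
def coverage_report(findings):
    """Summarize Step 7: percent of applicable requirements met, plus gaps
    sorted by descending risk score. Each finding is (req_id, status, risk)."""
    applicable = len(findings)
    compliant = sum(1 for _, status, _ in findings if status == "compliant")
    gaps = sorted(
        (f for f in findings if f[1] != "compliant"),
        key=lambda f: -f[2],
    )
    return {"coverage_pct": round(100 * compliant / applicable, 1), "gaps": gaps}

findings = [
    ("B31.3-345.4", "compliant", 0.0),
    ("B31.3-341.4", "non-compliant", 0.76),
    ("B31.3-323.2", "partial", 0.31),
    ("B31.3-302.3", "compliant", 0.0),
]
report = coverage_report(findings)
# coverage_pct == 50.0; the highest-risk gap leads the remediation list
```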

4.2 Facility Documentation Integration

Gap analysis depends critically on facility documentation quality. The system accommodates varying levels of documentation completeness:

  • Comprehensive documentation: P&IDs, equipment datasheets, SOPs, design basis documents → high-confidence gap determination
  • Partial documentation: P&IDs and SOPs but incomplete equipment specifications → medium confidence, with explicit uncertainty flagging
  • Minimal documentation: Only operating procedures → low confidence, with recommendation for documentation development as prerequisite for compliance verification

For facilities with incomplete documentation, the system identifies documentation gaps as intermediate requirements that must be addressed before compliance can be fully assessed.

4.3 Validation Methodology

To validate NORMEX outputs, we implement a multi-layer verification approach:

  1. AI confidence filtering: Outputs with confidence <0.75 are automatically flagged for expert review
  2. Expert spot-checking: Random sampling of 10–20% of identified gaps for manual verification by process safety engineers
  3. Requirement-to-requirement validation: Cross-referencing extracted requirements against original standards to verify accurate capture
  4. Facility-documentation alignment: Confirming that facility data used in gap determination matches actual facility specifications

These validation steps enable quantification of system accuracy metrics (precision, recall, F1-score) and identification of systematic biases or error patterns.
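With spot-check counts in hand, the accuracy metrics reduce to standard precision/recall/F1 arithmetic. The counts below use the pilot's validation sample from Section 5 (45 sampled gaps, 7 false positives, 0 missed gaps) purely as a worked example.

```python
def accuracy_metrics(tp, fp, fn):
    """Precision/recall/F1 from expert spot-check counts:
    tp = AI gaps confirmed, fp = AI gaps rejected, fn = gaps the AI missed."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# 45 sampled gaps: 38 confirmed, 7 false positives, 0 false negatives
p, r, f1 = accuracy_metrics(tp=38, fp=7, fn=0)
```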


5. Results

5.1 Pilot Deployment Performance

We conducted pilot deployment of NORMEX Standards AI at a representative SMB refinery facility with scope covering piping systems, rotating equipment, and safety instrumentation.

5.1.1 Processing Metrics

ASME B31.3 Process Piping Standard Gap Analysis

Metric                              Value
Standards PDF size                  485 pages
Initial requirements extracted      347
Requirements after deduplication    312
Applicable to facility scope        284
Processing time                     4.2 days †
Manual benchmark (estimated)        22 business days
Time reduction                      81%
Cost (NORMEX)                       ~$1,200 †
Cost (manual, estimated)            ~$15,000
Cost reduction                      92%

The processing timeline breaks down as follows:
– Ingestion and OCR: 0.3 days (automated)
– Requirement extraction and normalization: 1.8 days (automated, with LLM inference)
– Facility integration and gap analysis: 1.7 days (automated agentic workflow)
– Expert review and confidence validation: 0.4 days (human review of flagged items)

5.1.2 Compliance Coverage

The gap analysis produced the following compliance profile for facility piping systems:

Compliance Category                       Number of Requirements    % of Total    Risk-Weighted %
Full compliance                           189                       66.5%         72.1%
Partial compliance                        67                        23.6%         19.3%
Non-compliance                            18                        6.3%          8.2%
Indeterminate (requires documentation)    10                        3.5%          0.4%

The “risk-weighted” column adjusts compliance percentages by the severity and likelihood of each gap, providing a more actionable view of true compliance risk. The facility’s compliance risk profile (8.2% weighted) fell within acceptable range for continued operations with documented remediation plan.
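One plausible reading of the risk-weighted column is a re-normalization in which each requirement contributes its risk weight (severity times likelihood) rather than a unit count. A sketch under that assumption, with toy weights:

```python
def risk_weighted_shares(requirements):
    """Re-weight compliance-category shares by each requirement's risk weight.
    Each item is (category, risk_weight)."""
    total = sum(risk for _, risk in requirements)
    shares = {}
    for category, risk in requirements:
        shares[category] = shares.get(category, 0.0) + risk
    return {c: round(100 * w / total, 1) for c, w in shares.items()}

# toy data: one severe non-compliance outweighs many low-risk compliant items
reqs = [("compliant", 0.1)] * 8 + [("non-compliant", 0.8), ("partial", 0.4)]
shares = risk_weighted_shares(reqs)
```

Under this weighting, eight low-risk compliant requirements and one severe gap end up with equal shares, which is the effect the column is meant to convey: headline counts understate concentrated risk.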

5.1.3 Gap Characterization

Identified gaps clustered into several categories:

  1. Design documentation gaps (8 findings): Original design basis and engineering calculations not available; affects verification of design factors, safety margins
  2. Material specification gaps (5 findings): Some installed piping materials lack traceability documentation; affects pressure-temperature envelope verification
  3. Inspection program gaps (3 findings): Inspection intervals and methodologies differ from recommended RAGAGEP for high-consequence piping
  4. Maintenance procedure gaps (2 findings): SOPs do not explicitly reference maintenance requirements for certain component types

5.1.4 Accuracy Validation

Expert spot-checking of 45 randomly selected gaps (15% of total) produced the following accuracy metrics:

Metric                               Value
Requirement extraction accuracy      94.2%
Compliance determination accuracy    88.6%
Gap characterization accuracy        91.1%
Overall output quality               91.3%

Accuracy was calculated as percentage of AI-identified gaps confirmed by independent expert review. Primary sources of false positives (gaps identified by AI but not confirmed):
– Applicability misclassification (4 cases): Requirement deemed applicable when specific design features exempted it
– Documentation interpretation (3 cases): Facility documentation interpreted conservatively; expert judgment allowed alternative compliance pathway

False negatives (genuine gaps missed by AI): 0 gaps identified in validation sample

5.2 Multi-Standard Analysis

To demonstrate scalability, we conducted sequential gap analysis of 5 additional engineering standards relevant to facility operations:

Standard             Title                                 Processing Time    Applicable Requirements    Gaps Identified
ASME B31.3           Process Piping                        4.2 days           284                        18
API 570              Piping Inspection Code                2.1 days           167                        8
ASME Section VIII    Pressure Vessels                      3.7 days           229                        12
API 581              Risk-Based Inspection                 1.9 days           98                         4
NFPA 70              National Electrical Code (excerpt)    1.5 days           76                         3

Total analysis time: 13.4 days for 5 standards (854 applicable requirements, 45 gaps)
Manual benchmark estimate: 60–70 business days
Time reduction: 80%

Combined risk-weighted compliance profile: 89.2% compliant, with 7.8% risk-weighted non-compliance requiring remediation.

5.3 Document Accuracy Metrics

To assess reliability of AI-generated outputs, we conducted detailed validation of requirement extraction against source standards:

Requirement Extraction Validation (random sample of 100 extracted requirements):

Category                                      Count    %
Accurate and complete                         91       91%
Essentially accurate, minor paraphrasing      5        5%
Partially accurate, missing qualifications    3        3%
Inaccurate or misinterpreted                  1        1%

The single inaccurate extraction involved a negated comparative in the source document (“not less than” misread as “less than”); this error type is catalogued and will be addressed in model fine-tuning.

Compliance Determination Validation (50 gap determinations independently verified):

Determination         AI Result    Expert Verification          Agreement
Compliant             24           23 confirmed, 1 disagreed    95.8%
Non-compliant         18           16 confirmed, 2 disagreed    88.9%
Partial compliance    8            7 confirmed, 1 disagreed     87.5%

Overall compliance determination agreement: 92.0% (46 of 50 determinations confirmed)

Disagreements principally involved interpretation of facility documentation where multiple compliance pathways could satisfy a requirement. These ambiguities were resolved through expert judgment, highlighting the importance of final human review in high-consequence settings.

5.4 Comparative Cost and Timeline Analysis

Traditional manual gap analysis workflow for a facility like the pilot site:

  • Process: Engagement of external consulting firm with process safety and engineering credentials
  • Timeline: 3–4 weeks from engagement through final report delivery
  • Cost: $12,000–$18,000 (typical $15,000 for medium-complexity facility)
  • Team composition: 1 senior process engineer (120–160 hours) + administrative support
  • Output quality: Professional audit-ready documentation; variable consistency depending on engineer expertise

NORMEX Standards AI workflow for equivalent analysis:

  • Process: Upload standards and facility documents; system performs automated analysis; expert review flagged items (4–8 hours professional review)
  • Timeline: 2–5 days depending on number of standards and documentation complexity
  • Cost: ~$500–$1,500 per analysis (cloud infrastructure + limited expert review)
  • Team composition: 4–8 hours of engineer time (review only)
  • Output quality: Machine-generated initial findings + expert-verified gap list; highly consistent analysis methodology

For a facility conducting annual or biennial gap analysis against 8–10 standards:

  • Manual approach (5-year cost): 5 engagements × $15,000 = $75,000
  • NORMEX approach (5-year cost): Platform subscription ($500/month × 60 months = $30,000) + analysis fees (5 × $2,000 = $10,000) = $40,000
  • 5-year savings: $35,000 (47% reduction)

The savings increase significantly for facilities conducting more frequent analysis or addressing larger numbers of standards.


6. Discussion

6.1 Technical Implications

The successful demonstration of automated RAGAGEP gap analysis has several important technical implications:

Scalability to enterprise compliance. Traditional enterprise compliance platforms operate at the workflow level (task assignment, approval routing, document storage) but have not attempted to automate the domain-specific cognitive task of standards interpretation. NORMEX demonstrates that this cognitive task can be substantially automated, enabling enterprise platforms to evolve from purely workflow-management systems toward active compliance analysis systems.

Standards as executable specifications. Representing engineering standards as structured databases of requirements with explicit applicability rules, quantitative parameters, and cross-references positions standards to function as “executable specifications” for facility compliance. This enables facilities to move from periodically-assessed compliance (today’s snapshot-in-time gap analysis) toward continuous compliance monitoring—flagging when changes to facility configuration, operating procedures, or staff competency create new compliance gaps.

Integration with autonomous systems. The framework is designed to serve as a compliance verification layer within autonomous facility management systems (AutoPID concept). As autonomous systems make operational decisions (equipment control, parameter optimization, maintenance scheduling), NORMEX can continuously verify that operational decisions remain compliant with RAGAGEP and regulatory requirements.

6.2 Regulatory and Compliance Implications

May 2027 STAA Deadline. The regulatory landscape is tightening, and the May 2027 STAA deadline is a hard constraint for facilities seeking DOE recognition or partnership. Automated gap analysis removes a critical scheduling bottleneck, enabling facilities to complete comprehensive RAGAGEP assessments rapidly enough to implement remediation before the deadline.

Documentation as compliance evidence. NORMEX produces detailed evidence trails linking facility documentation to specific compliance determinations. This documentation trail is increasingly valuable in regulatory interactions, providing regulators with transparent evidence that compliance assessments were rigorous and defensible.

Evolving RAGAGEP landscape. As standards are revised (ASME B31.3 updates, API standards revisions, NFPA code changes), facilities need to reassess compliance against new editions. Automated analysis enables facilities to maintain compliance currency and flag requirement changes that require operational adjustments.

6.3 SMB Competitive Impact

For small and mid-sized facilities, cost has been a barrier to comprehensive RAGAGEP compliance. Manual analysis at $15,000 per standard makes it economically rational for SMB facility managers to assess only the 3–4 highest-consequence standards, leaving significant compliance exposure. Automated analysis with an 80% cost reduction ($3,000 per standard) makes a comprehensive 8–10 standard analysis economically feasible, improving SMB compliance posture and reducing regulatory liability.

This has immediate competitive implications:

  • Regulatory parity: SMBs can achieve compliance rigor comparable to larger competitors without proportionately larger investment
  • Insurance and financing: Better compliance documentation improves insurance rates and facilitates financing for facility upgrades
  • M&A positioning: Comprehensive compliance documentation increases asset valuation in acquisition contexts

6.4 Limitations and Ongoing Challenges

Several limitations and challenges remain:

Facility documentation quality dependency. Gap analysis accuracy depends critically on facility documentation quality. Facilities with incomplete or outdated documentation will receive lower-confidence compliance determinations. We have implemented mechanisms to flag documentation deficiencies, but ultimately achieving high-confidence compliance assessment requires facility investment in documentation completeness.

Applicability determination. While the system performs well at extracting requirements from standards, facility-specific applicability determination (does this requirement apply to our facility?) remains an area where expert judgment is valuable. We are developing improved mechanisms to codify applicability rules, but edge cases will continue to require expert review.
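
A minimal sketch of this triage pattern appears below: codified rules answer the clear cases, and anything the rules cannot decide is routed to an expert queue rather than guessed. The tag vocabulary and rule table are illustrative assumptions.

```python
from enum import Enum

class Applicability(Enum):
    APPLIES = "applies"
    EXEMPT = "exempt"
    NEEDS_REVIEW = "needs_expert_review"

def determine_applicability(clause_tags: set, facility_tags: set,
                            codified: dict) -> Applicability:
    """Decide via codified rules; defer undecidable edge cases to a human."""
    key = frozenset(clause_tags & facility_tags)
    if key in codified:
        return Applicability.APPLIES if codified[key] else Applicability.EXEMPT
    return Applicability.NEEDS_REVIEW  # edge case: queue for expert review

# Hypothetical rule table: overlap on this tag means the clause applies;
# no overlap at all means the clause is out of scope.
rules = {
    frozenset({"category-m-fluid"}): True,
    frozenset(): False,
}

print(determine_applicability({"category-m-fluid"},
                              {"category-m-fluid", "sour"}, rules))
```

The key design choice is the explicit `NEEDS_REVIEW` outcome: the system never silently guesses on an un-codified tag combination, which preserves the expert-in-the-loop property the text describes.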

Specialized domain knowledge. Some requirements involve specialized domain knowledge (e.g., corrosion allowance calculations for specific alloy-service combinations, cyclic stress analysis for fatigue applications) that exceeds the scope of current LLM capabilities. These requirements continue to require expert assessment.

Dynamic compliance monitoring. While NORMEX can perform periodic gap analysis snapshots, continuous compliance monitoring—flagging when facility changes create new compliance gaps—requires integration with facility asset management and change control systems. This integration is planned for Phase 2.
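
One way the planned change-control integration could scope its work, sketched under assumed data shapes: given a change event tagged by the facility’s management-of-change (MOC) system, re-run only the requirements whose tags overlap the event rather than repeating a full gap analysis.

```python
# Sketch: select the requirements affected by an MOC change event.
# The tag vocabulary and clause identifiers are illustrative assumptions.
def requirements_to_reassess(change_tags: set, requirements: list) -> list:
    return [r for r in requirements if r["tags"] & change_tags]

reqs = [
    {"clause": "B31.3 345.4", "tags": {"piping", "pressure-test"}},
    {"clause": "NFPA 70 501", "tags": {"electrical", "hazardous-area"}},
]

event = {"piping"}  # e.g. an MOC record for a pipe-spec change
print([r["clause"] for r in requirements_to_reassess(event, reqs)])
# ['B31.3 345.4']
```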

6.5 Validation and Continued Development

To ensure continued reliability and performance improvement:

  1. Expanded validation dataset: Expand validation to 15–20 facilities across diverse industry segments (refineries, petrochemical plants, specialty chemical facilities)
  2. Accuracy benchmarking: Conduct rigorous accuracy studies comparing NORMEX determinations against independent expert assessments
  3. OpenPSM Benchmark: Develop standardized benchmark dataset for evaluating RAGAGEP compliance systems (planned for H2 2026)
  4. Model improvement: Continue fine-tuning LLMs on domain-specific standards text and documented gap analysis examples
  5. Uncertainty quantification: Improve confidence scoring algorithms to provide better calibration for decision-making
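
On item 5, one standard way to measure calibration is expected calibration error (ECE): bin determinations by confidence score and compare each bin’s mean confidence to its observed accuracy. The sketch below uses toy data, not pilot results.

```python
# Binned calibration check: a well-calibrated system's 0.9-confidence
# determinations should be correct about 90% of the time.
def expected_calibration_error(confidences, correct, n_bins=5):
    bins = [[] for _ in range(n_bins)]
    for c, ok in zip(confidences, correct):
        idx = min(int(c * n_bins), n_bins - 1)
        bins[idx].append((c, ok))
    total = len(confidences)
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        accuracy = sum(ok for _, ok in b) / len(b)
        ece += (len(b) / total) * abs(avg_conf - accuracy)
    return ece

# Toy data: ten determinations at 0.9 confidence, nine of them correct,
# so confidence matches accuracy and the ECE is (near) zero.
scores = [0.9] * 10
labels = [1, 1, 1, 1, 1, 1, 1, 1, 1, 0]
print(round(expected_calibration_error(scores, labels), 3))
```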

7. Genesis Mission Alignment

7.1 Mission Objectives

The DOE Genesis Mission seeks to “Rethink Advanced Manufacturing and Industrial Productivity” through AI-driven autonomous systems, emphasizing:

  • Autonomous decision-making: Systems that operate independently within defined parameters
  • Real-time optimization: Continuous monitoring and adjustment of operating parameters
  • Safety and compliance: Operations that maintain safety margins and regulatory compliance autonomously
  • Scalability to DOE infrastructure: Approaches that generalize to National Lab operations

7.2 NORMEX Alignment

NORMEX Standards AI directly addresses several Genesis Mission objectives:

Autonomous compliance verification. Traditional compliance assessment is episodic (annual audits, incident investigations) and reactive (responding to regulatory enforcement). NORMEX supports continuous, autonomous compliance verification, allowing autonomous systems to make operational decisions with real-time confirmation that those decisions remain compliant with RAGAGEP and regulatory requirements.

Safety in autonomous systems. A critical concern in autonomous facility management is safety assurance: ensuring that autonomous decision-making never creates unsafe conditions or regulatory compliance violations. NORMEX provides a systematic, auditable mechanism for verifying that autonomous operational decisions remain within compliance envelopes.

Knowledge codification. RAGAGEP knowledge—embedded in hundreds of technical standards, industry guidelines, and institutional practices—is currently distributed and difficult to access. NORMEX codifies this knowledge in structured, computational form, enabling autonomous systems to reason about compliance and safety with reference to authoritative technical knowledge.

Scaling to DOE facilities. DOE National Labs operate under multiple regulatory regimes (OSHA PSM, EPA RMP, DOE-specific safety orders, NNSA safeguards requirements). Comprehensive RAGAGEP compliance across this complex regulatory landscape is operationally challenging. NORMEX can systematize compliance assessment across multiple standards and regulatory regimes, improving overall compliance assurance.

7.3 Phase 1 DOE Facility Deployment

Porritt Inc. proposes Phase 1 deployment at a DOE National Lab facility with the following objectives:

  1. Baseline compliance assessment: Conduct comprehensive RAGAGEP gap analysis across 10–15 relevant standards
  2. Remediation planning: Develop prioritized remediation roadmap addressing identified gaps
  3. Integration with facility safety systems: Integrate NORMEX outputs into facility safety management system and continuous improvement processes
  4. Autonomous compliance monitoring: Prototype integration with autonomous facility management systems (AutoPID)
  5. Benchmark development: Establish standardized dataset from facility gap analysis for OpenPSM Benchmark

This Phase 1 deployment will establish technical credibility for broader DOE partnerships and provide concrete evidence of compliance improvement through automated analysis.


8. AutoPID: Autonomous Process and Instrumentation Design Integration

As a complement to NORMEX Standards AI, Porritt Inc. is developing AutoPID—a framework for autonomous updates to Process and Instrumentation Diagrams (P&IDs) in response to operational changes, safety incidents, or regulatory requirements.

AutoPID objectives:

  • Drift detection: Automatically identify when operational P&IDs diverge from as-designed specifications
  • Compliance verification: Flag when facility modifications create compliance gaps relative to RAGAGEP
  • Design updates: Generate updated P&ID specifications addressing regulatory or operational needs
  • Autonomous implementation: In future phases, enable autonomous facility management systems to request P&ID updates and implementation authorization

NORMEX gap analysis provides critical inputs to AutoPID by identifying which regulatory requirements or engineering practices are not currently reflected in facility P&IDs and procedures. This creates a virtuous cycle:

Facility Operation
        ↓
NORMEX Gap Analysis
        ↓
Identified Gaps and Opportunities
        ↓
AutoPID P&ID Updates
        ↓
Autonomous Implementation
        ↓
Improved Compliance
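
At its simplest, the drift-detection step in this cycle can be framed as a diff between two edge sets: the as-designed P&ID and the operational one, each represented as directed tag-to-tag connections. The sketch below uses invented tags and an assumed edge-set representation, not a real P&ID data model.

```python
# Minimal drift-detection sketch: diff two P&IDs represented as sets of
# directed connections (source tag -> destination tag).
def pid_drift(as_designed: set, operational: set) -> dict:
    return {
        "missing": as_designed - operational,       # designed, absent in field
        "undocumented": operational - as_designed,  # field change not drawn
    }

designed = {("P-101", "E-201"), ("E-201", "V-301")}
field = {("P-101", "E-201"), ("E-201", "V-301"), ("V-301", "PSV-12")}

drift = pid_drift(designed, field)
print(drift["undocumented"])  # {('V-301', 'PSV-12')}: flag for AutoPID review
```

Each undocumented edge becomes a candidate P&ID update for AutoPID, and each missing edge a potential compliance gap for NORMEX to assess.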

9. Conclusion and Future Work

9.1 Summary

This white paper presents NORMEX Standards AI, a framework for automating RAGAGEP gap analysis through AI-driven standards interpretation, facility documentation integration, and agentic reasoning. Pilot deployment demonstrates substantial improvements over manual gap analysis:

  • Processing speed: 81% reduction (4.2 days vs. 22 business days for ASME B31.3 analysis)
  • Cost: 92% reduction (~$1,200 vs. $15,000 for equivalent analysis)
  • Coverage: Scalable to 5+ standards within 2-week timeline
  • Accuracy: 91.3% overall output quality with detailed evidence trails

The framework positions Porritt Inc. to serve as a technology partner for DOE Genesis Mission initiatives, enabling autonomous facility management systems that maintain real-time compliance verification.

9.2 Future Work and Roadmap

Near-term (Q2–Q3 2026):

  1. Phase 1 DOE facility deployment: Conduct baseline RAGAGEP assessment at designated National Lab facility
  2. OpenPSM Benchmark: Develop standardized dataset from multiple facility assessments for performance benchmarking
  3. Model fine-tuning: Improve LLM performance through additional domain-specific training data
  4. Enterprise integration: Develop connectors for Intelex, Cority, and other enterprise compliance platforms

Medium-term (Q4 2026–Q1 2027):

  1. Continuous compliance monitoring: Implement real-time compliance verification as facility documentation updates
  2. AutoPID prototype: Develop P&ID drift detection and update recommendation system
  3. Multi-facility benchmarking: Expand validation across 15–20 facilities in diverse industrial segments
  4. Regulatory engagement: Present framework and findings to OSHA, EPA, CCPS for stakeholder feedback

Long-term (2027+):

  1. Autonomous facility integration: Deploy NORMEX as compliance verification layer within autonomous facility management systems
  2. Continuous improvement loop: Implement feedback mechanisms to improve compliance outcomes in response to operational experience
  3. Standards evolution tracking: Develop mechanisms to automatically incorporate updated standards editions and assess compliance impacts
  4. Industry adoption: Establish NORMEX as industry-standard approach to automated compliance assessment

9.3 Research Opportunities

Successful commercialization of NORMEX opens several research directions:

  • AI interpretability in compliance: Developing better methods to understand and explain AI-generated compliance assessments
  • Uncertainty quantification: Improved confidence scoring for high-stakes compliance determinations
  • Multi-standard optimization: Efficiently handling dependencies and conflicts across large numbers of standards
  • Dynamic compliance: Continuous compliance monitoring as facilities evolve
  • Safety assurance: Formal methods for verifying that autonomous facility management decisions maintain compliance

10. References

[1] Occupational Safety and Health Administration. “Process Safety Management of Highly Hazardous Chemicals.” 29 CFR 1910.119. U.S. Department of Labor, 2022.

[2] U.S. Environmental Protection Agency. “Risk Management Program (RMP).” 40 CFR Part 68. March 2024 update. Federal Register, 2024.

[3] American Society of Mechanical Engineers. “Process Piping.” ASME B31.3. ASME, 2020 edition.

[4] American Petroleum Institute. “Piping Inspection Code: In-service Inspection, Rating, Repair, and Alteration of Piping Systems.” API 570. American Petroleum Institute, 2016.

[5] American Petroleum Institute. “Risk-Based Inspection Technology.” API 581. American Petroleum Institute, 2016.

[6] American Society of Mechanical Engineers. “Rules for Construction of Pressure Vessels: Division 1.” ASME Section VIII. ASME, 2021 edition.

[7] National Fire Protection Association. “National Electrical Code.” NFPA 70. NFPA, 2020 edition.

[8] Center for Chemical Process Safety. “Guidelines for Process Safety Fundamentals in General Plant Operations.” American Institute of Chemical Engineers, 2007.

[9] Center for Chemical Process Safety. “Layer of Protection Analysis.” American Institute of Chemical Engineers, 2001.

[10] U.S. Department of Energy. “Process Safety Management Rule.” DOE Technical Standard TSR-3407. DOE Office of Nuclear Safety Policy and Standards, 2010.

[11] International Society of Automation. “ANSI/ISA-84.00.01-2015, Functional Safety: Safety Instrumented Systems for the Process Industry Sector.” ISA, 2015.

[12] National Institute of Standards and Technology. “Framework for Improving Critical Infrastructure Cybersecurity.” NIST, 2018.

[13] U.S. Department of Energy. “Genesis Mission Program Guidance.” Office of Manufacturing and Energy Assisted Communities, 2025.

[14] Vaswani, A., et al. “Attention is All You Need.” Advances in Neural Information Processing Systems, 2017.

[15] Lewis, P., et al. “Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks.” Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020.

[16] Brown, T. B., et al. “Language Models are Few-Shot Learners.” arXiv:2005.14165, 2020.

[17] OpenAI. “GPT-4 Technical Report.” arXiv:2303.08774, 2023.

[18] Davison, J., Feldman, J., and Rush, A. M. “Commonsense Knowledge Mining from Pretrained Models.” Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP-IJCNLP), 2019.

[19] U.S. Chemical Safety and Hazard Investigation Board. “Investigations Involving Process Safety Management Deficiencies.” CSB Safety Report, 2019–2025 (ongoing).

[20] American Institute of Chemical Engineers. “Inherently Safer Chemical Processes: A Life Cycle Approach.” American Institute of Chemical Engineers, 2015.


Disclaimer

Performance figures marked with † (e.g., “4.2 days,” “$1,200,” “91.3%”) are illustrative projections based on pilot deployment data conducted in controlled environments. Actual performance in production deployments may vary based on facility documentation quality, standard complexity, facility-specific customization requirements, and other operational factors. Porritt Inc. provides these figures for framework illustration and technology capability demonstration only. They should not be construed as contractual performance guarantees. Actual system performance should be validated through dedicated pilot projects and proof-of-concept engagements.


Confidentiality Notice

CONFIDENTIAL — TRADE SECRET — PROPRIETARY

This document contains proprietary information and trade secrets of Porritt Inc. and is protected under the Defend Trade Secrets Act (18 U.S.C. § 1836), the Economic Espionage Act (18 U.S.C. § 1832), and OSHA Process Safety Management regulations (29 CFR 1910.119(p)).

This document may be shared with authorized personnel of the U.S. Department of Energy, National Laboratories, and Genesis Mission evaluation committees for purposes of reviewing Porritt Inc.’s technical qualifications and partnership capabilities. Unauthorized distribution, reproduction, or disclosure is prohibited.

Porritt Inc.
Salt Lake City, UT
April 2026

