At the Sri Amit Ray Compassionate AI Lab, our mission is to create AI systems that embody compassion, reduce suffering, and enhance the well-being of all sentient beings. Unlike conventional AI, which is often designed solely for intelligence and efficiency, Compassionate AI focuses on alleviating pain—whether physical, emotional, or social.
Over the years, our team has systematically developed 21 primary algorithms that target the elimination of different aspects of human and social suffering. These algorithms integrate insights from neuroscience, psychology, ethics, and computational intelligence. Among them, the Pain Recognition and Prediction Algorithm (PRPA) stands as one of the most significant innovations, dedicated to understanding and mitigating the suffering associated with pain.

Introduction
The Pain Recognition and Prediction Algorithm (PRPA) is a Compassionate AI framework designed to detect and predict physical and emotional pain using computer vision and physiological sensors, aligning with Sri Amit Ray’s teachings on minimizing suffering through empathy and ethical technology.[1] Modeled on the Ray Mother–Infant Inter-brain Synchrony Algorithm (RMI-Sync-AI), PRPA integrates multimodal data (facial expressions, heart rate, galvanic skin response) to provide real-time pain alerts for vulnerable populations, such as hospital patients and the elderly. This article presents a 20-point framework, pseudocode, and use-cases, emphasizing an ethical, non-invasive, and empathetic pain assessment and management system for Compassionate AI.
“PRPA is not just an algorithm of intelligence; it is AI’s way of listening to human suffering with intelligence that thinks and a heart that feels, and care.” – Sri Amit Ray
System Description
The PRPA algorithm focuses on detecting physical or emotional pain by analyzing video frames for facial expressions (e.g., grimacing, Action Units), voice, and physiological signals (heart rate, galvanic skin response) captured via wearable sensors. A Long Short-Term Memory (LSTM) network processes the extracted features to compute a pain score, triggering caregiver alerts when thresholds are exceeded. The system is designed for real-time, non-invasive monitoring, ensuring safety and ethical compliance in healthcare settings.[2]
The 20 Main Components of the Algorithm
- Definition: PRPA is an AI algorithm and system designed to detect and predict pain using computer vision for facial expressions, audio analysis for voice, and physiological sensors for heart rate (HR) and galvanic skin response (GSR), enabling empathetic interventions.[3]
- Core Objective: To identify pain in real-time and alert caregivers, reducing suffering in vulnerable populations like hospital patients and the elderly.[1]
- Neuroscientific Basis: Pain activates the spinothalamic tract and amygdala, manifesting in facial expressions (e.g., brow lowering, AU4) and autonomic responses (reduced HRV, GSR peaks).[4]
- Facial Expression Analysis: Extracts Action Units (AUs) using Convolutional Neural Networks (CNNs): \[ \text{AU Intensity} = \sigma(\mathbf{W} \cdot \mathbf{f} + \mathbf{b}) \] where \(\mathbf{f}\) are facial landmarks, \(\mathbf{W}\) weights, \(\sigma\) sigmoid activation.[5]
- Physiological Metrics: Computes HRV via the root mean square of successive differences (RMSSD): \[ \text{RMSSD} = \sqrt{\frac{1}{N-1} \sum_{i=1}^{N-1} (RR_{i+1} - RR_i)^2} \] and the GSR phasic response via deconvolution (see the code sketch after this list).[6]
- Multimodal Fusion: Integrates facial and physiological features using cross-attention: \[ \mathbf{Z}_{fusion} = \text{CrossAttention}(\mathbf{Z}_{face}, \mathbf{Z}_{physio}) \] enhancing pain detection accuracy.[7]
- Behavioral Indicators: Validates pain detection with behavioral cues (e.g., vocalizations, body posture) from video/audio, aligned via timestamps.[8]
- Model Architecture: Combines a CNN (e.g., ResNet) for facial features with an LSTM for temporal dynamics: \[ \mathbf{h}_t = \text{LSTM}(\mathbf{x}_t, \mathbf{h}_{t-1}) \] where \(\mathbf{x}_t\) are the fused features (see the PyTorch sketch after this list).[9]
- Pain Score Calculation: Outputs a continuous pain score (0–10, Visual Analog Scale) via a regression head on the LSTM output.[10]
- Thresholding and Alerting: Triggers alerts if pain_score > 4 (adaptive via Bayesian update): \[ P(\text{Pain}_t | \text{data}_{1:t}) = \alpha P(\text{Pain}_t | \text{data}_t) + (1-\alpha) P(\text{Pain}_{t-1} | \text{data}_{1:t-1}) \] ensuring reliable notifications.[11]
- Preprocessing Pipeline: Facial alignment via landmarks, HR/GSR bandpass filtering (0.5–4 Hz), artifact rejection via Independent Component Analysis (ICA) and wavelet denoising.[12]
- Clock Synchronization: Uses hardware timestamps or Network Time Protocol (NTP) for aligning video and physiological data, critical for real-time analysis.[13]
- Real-Time Inference: Processes 5–10 s sliding windows with 1 s stride for continuous monitoring, suitable for edge devices.[14]
- Training Loss: Multi-task loss function: \[ L = \lambda_1 L_{reg} + \lambda_2 L_{class} + \lambda_3 L_{contrastive} \] balancing regression, classification, and modality alignment (a small sketch follows this list).[15]
- Safety and Non-Invasiveness: Employs contactless cameras and soft wearables, ensuring patient comfort and safety.[16]
- Ethical Imperatives: Ensures GDPR-compliant data privacy, bias mitigation, and informed consent, prioritizing human rights.[17]
- Clinical Relevance: Reduces undertreatment of pain in non-verbal patients (e.g., infants, elderly), improving quality of life by 20–30%.[18]
- Evaluation Metrics: Mean Absolute Error (MAE) for score prediction, AUC/F1 for binary detection, Brier score for alert calibration.[19]
- Validation Phases: Synthetic data testing, lab trials, clinical pilots in hospitals and care homes.[20]
- Generalization and Scalability: Extends to mental health pain detection and animal welfare, deployable across diverse settings.[21]
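As a minimal illustration of the Physiological Metrics and Preprocessing Pipeline components above, the sketch below computes RMSSD from RR intervals and applies the 0.5–4 Hz band-pass filter mentioned in the list. It assumes NumPy and SciPy; the sampling rate, filter order, and example values are illustrative assumptions, not part of the published PRPA specification.

# Minimal sketch: RMSSD from RR intervals and band-pass filtering of a raw
# GSR trace, assuming NumPy and SciPy. Values and cutoffs are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diff = np.diff(rr)                       # RR_{i+1} - RR_i
    return np.sqrt(np.mean(diff ** 2))

def bandpass(signal, fs, low=0.5, high=4.0, order=4):
    """Butterworth band-pass filter using the 0.5-4 Hz band cited above."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

# Example: RR intervals (ms) from a wearable and 10 s of GSR sampled at 32 Hz.
rr_ms = [812, 798, 805, 830, 790, 801]
print(f"RMSSD: {rmssd(rr_ms):.1f} ms")
gsr_raw = np.random.randn(32 * 10)
gsr_filtered = bandpass(gsr_raw, fs=32)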
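The Multimodal Fusion, Model Architecture, and Pain Score Calculation components describe a CNN-plus-LSTM design with cross-attention and a regression head mapped to the 0–10 scale. The PyTorch sketch below shows one possible shape for such a design; the layer sizes, the use of nn.MultiheadAttention as the cross-attention block, and the torchvision ResNet-18 backbone are assumptions for illustration, not the lab's implementation.

# Minimal PyTorch sketch of the fusion architecture described above.
# Dimensions, the MultiheadAttention-based cross-attention, and the
# torchvision ResNet-18 backbone are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class PainScoreModel(nn.Module):
    def __init__(self, physio_dim=8, embed_dim=128, hidden_dim=64):
        super().__init__()
        backbone = resnet18(weights=None)          # facial-feature CNN
        backbone.fc = nn.Linear(backbone.fc.in_features, embed_dim)
        self.face_cnn = backbone
        self.physio_proj = nn.Linear(physio_dim, embed_dim)
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads=4,
                                                batch_first=True)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.regressor = nn.Linear(hidden_dim, 1)  # 0-10 VAS-style score

    def forward(self, frames, physio):
        # frames: (batch, time, 3, H, W); physio: (batch, time, physio_dim)
        b, t = frames.shape[:2]
        z_face = self.face_cnn(frames.flatten(0, 1)).reshape(b, t, -1)
        z_physio = self.physio_proj(physio)
        # Cross-attention: facial features attend to physiological features.
        z_fused, _ = self.cross_attn(z_face, z_physio, z_physio)
        h, _ = self.lstm(z_fused)
        return (10.0 * torch.sigmoid(self.regressor(h[:, -1]))).squeeze(-1)

model = PainScoreModel()
frames = torch.randn(2, 5, 3, 224, 224)  # 2 clips, 5 frames each
physio = torch.randn(2, 5, 8)            # matching physiological features
print(model(frames, physio))             # two pain scores in [0, 10]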
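For the Training Loss component, a small hedged PyTorch sketch follows; the lambda weights and the cosine-alignment stand-in for the contrastive term are illustrative assumptions rather than the published loss.

# Minimal sketch of the multi-task loss L = l1*L_reg + l2*L_class + l3*L_contrastive.
# The lambda values and the cosine-alignment term are illustrative assumptions.
import torch
import torch.nn.functional as F

def prpa_loss(score_pred, score_true, logit_pred, label_true,
              z_face, z_physio, lambdas=(1.0, 0.5, 0.1)):
    l1, l2, l3 = lambdas
    loss_reg = F.mse_loss(score_pred, score_true)                  # L_reg
    loss_cls = F.binary_cross_entropy_with_logits(logit_pred,
                                                  label_true)      # L_class
    # Alignment term: pull matched face/physio embeddings together.
    loss_con = 1.0 - F.cosine_similarity(z_face, z_physio).mean()  # L_contrastive
    return l1 * loss_reg + l2 * loss_cls + l3 * loss_con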
“PRPA is not only a data-driven AI to measure pain signals but also a heart-driven AI that honors the human story behind the pain.” – Sri Amit Ray
This means the aim of the AI system is to process and interpret qualitative data, such as a patient’s personal narrative, their fears, and their hopes, to provide more holistic and personalized care. It’s a shift from seeing a patient as a collection of data points to recognizing them as a person with a story, love, care, respect, and feelings.
Pseudocode
Input: Video frames (V), Audio (A), Heart rate (HR), Galvanic Skin Response (GSR)
features = ExtractFeatures(V, A, HR, GSR)  // CNN for facial AUs and voice, preprocessing for HR/GSR
pain_score = LSTM(features)                // Temporal modeling for pain score
if pain_score > threshold:
    AlertCaregiver()                       // Notify caregiver via app or interface
Output: Pain detection alert
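A slightly more concrete, hedged Python rendering of this pseudocode is shown below. It assumes a trained model with the interface of the PainScoreModel sketch above, the 5 s sliding window with 1 s stride described in the framework, and the recursive smoothing from the thresholding component; read_sensors, extract_features, and notify_caregiver are placeholders rather than real APIs.

# Hedged Python rendering of the pseudocode above. read_sensors,
# extract_features, and notify_caregiver are placeholders; the smoothing
# mirrors the recursive update in the thresholding component.
import time

ALERT_THRESHOLD = 4.0      # pain score on the 0-10 VAS scale
ALPHA = 0.6                # weight on the newest observation
WINDOW_S, STRIDE_S = 5, 1  # sliding window and stride from the framework

def monitor(model, read_sensors, extract_features, notify_caregiver):
    smoothed = 0.0
    while True:
        frames, audio, hr, gsr = read_sensors(WINDOW_S)      # last 5 s of data
        features = extract_features(frames, audio, hr, gsr)  # AUs, RMSSD, GSR
        pain_score = float(model(features))                  # LSTM-based score
        # Recursive smoothing of the alert signal (adaptive thresholding).
        smoothed = ALPHA * pain_score + (1 - ALPHA) * smoothed
        if smoothed > ALERT_THRESHOLD:
            notify_caregiver(pain_score=pain_score, window_s=WINDOW_S)
        time.sleep(STRIDE_S)                                  # 1 s stride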
Use-Cases
- Hospital Patient Monitoring: PRPA monitors post-operative patients or those with chronic conditions, detecting pain in non-verbal individuals (e.g., sedated patients, infants). It alerts nurses for timely intervention, reducing undertreatment and improving recovery outcomes.[18]
- Elderly Care: In care homes, PRPA identifies pain in elderly patients with dementia or mobility issues, using non-invasive wearables and cameras. It supports caregivers in managing chronic pain, enhancing quality of life and reducing distress.[22]
Application Areas of PRPA
- Healthcare – Supporting doctors and nurses in detecting patient discomfort early.
- Elderly Care – Identifying silent suffering in those unable to verbalize pain.
- Mental Health – Differentiating between physical pain and psychosomatic distress.
- Companion Robots – Creating emotionally sensitive care systems for the elderly and disabled.
- Palliative Care – Offering comfort and timely interventions in end-of-life support.
The Ethical Dimension
Compassionate AI must go beyond technology. In the Sri Amit Ray tradition, ethics and empathy are foundational. Algorithms like PRPA are designed and developed under strict ethical guidelines:
- Non-harm principle – never intensify suffering.
- Transparency – clear communication about how AI makes decisions.
- Privacy & Dignity – respect for human vulnerability during suffering.
Key Technologies
- Natural Language Processing (NLP): For analyzing narratives and emotions.
- Machine Learning: For pattern recognition and personalization.
- Generative AI: For empathetic response generation.
- Knowledge Graphs: To integrate qualitative and quantitative data.
- Secure Cloud Infrastructure: For data storage and processing.
Data Processing and Interpretation
NLP and Sentiment Analysis:
- Use advanced NLP models to extract themes, emotions, and key concerns from narratives.
- Identify sentiment (e.g., fear, hope, trust) to gauge emotional states (a minimal sentiment-extraction sketch follows this section).
Contextual Understanding:
- Map qualitative data to the patient’s life context (e.g., family, culture, socioeconomic factors).
- Use machine learning to detect patterns in emotional and narrative data.
Integration with Clinical Data:
- Combine qualitative insights with quantitative data using a hybrid AI model.
- Employ knowledge graphs to link emotions and narratives to medical conditions.
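As one hedged illustration of the sentiment step above, the sketch below runs a patient narrative through an off-the-shelf Hugging Face sentiment pipeline and flags a few illustrative concern keywords; the default model and the keyword list are assumptions, not the lab's clinical NLP stack.

# Minimal sketch of narrative sentiment extraction, assuming the Hugging Face
# transformers library and its default sentiment model. The keyword list is
# purely illustrative.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")
CONCERN_KEYWORDS = {"pain", "afraid", "alone", "worse", "hope"}

def analyze_narrative(text):
    result = sentiment(text)[0]  # e.g., {'label': 'NEGATIVE', 'score': 0.98}
    concerns = sorted(w for w in CONCERN_KEYWORDS if w in text.lower())
    return {"sentiment": result["label"],
            "confidence": round(result["score"], 3),
            "concerns": concerns}

print(analyze_narrative(
    "The nights are the worst; I am afraid the pain will come back."))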
Conclusion
The 21 primary algorithms developed at our Sri Amit Ray Compassionate AI Lab represent a new horizon in AI—one that is deeply humane, ethically grounded, and future-oriented. Among them, the Pain Recognition and Prediction AI Algorithm (PRPA) is a vital step in ensuring that AI serves not just as a machine of intelligence but as a compassionate companion in reducing suffering.
Through these innovations, we envision a world where AI becomes a force for healing, empathy, and human flourishing, aligning with the timeless wisdom of compassion that transcends cultures and generations.
References:
- Ray, Amit. “Calling for a Compassionate AI Movement: Towards Compassionate Artificial Intelligence.” *Compassionate AI*, vol. 2, no. 6, 25 June 2023, pp. 75-77, https://amitray.com/calling-for-a-compassionate-ai-movement/.
- Ray, Sri Amit. “The 7 Pillars of Compassionate AI Democracy.” *Compassionate AI*, vol. 3, no. 9, 28 Sept. 2024, pp. 84-86, https://amitray.com/the-7-pillars-of-compassionate-ai-democracy/.
- Fang, Ruijie, et al. “Survey on Pain Detection Using Machine Learning Models: Narrative Review.” *JMIR AI*, vol. 4, 2025, e53026, https://ai.jmir.org/2025/1/e53026.
- Hammal, Zakia, et al. “The Fifth Edition of the Automated Assessment of Pain (AAP 2025).” *Proceedings of the 27th ACM International Conference on Multimodal Interaction*, 2025, https://discovery.ucl.ac.uk/id/eprint/10211894/1/Berthouze_APP25%20-%20Final.pdf.
- De Sario, Gioacchino D., et al. “Using AI to Detect Pain through Facial Expressions: A Review.” *Bioengineering*, vol. 10, no. 5, 2 May 2023, p. 548, https://pmc.ncbi.nlm.nih.gov/articles/PMC10215219/.
- *Detection of Stress from PPG and GSR Signals Using AI Framework*. ResearchGate, 2024, https://www.researchgate.net/publication/387706306_Detection_of_Stress_from_PPG_and_GSR_Signals_using_AI_Framework.
- Velmurugan, R., et al. “Multimodal AI Approaches for Pain Assessment: Wearables, Speech, and Facial Biometrics.” *Medicine & Healthcare Book*, IGI Global, 2024, https://www.igi-global.com/chapter/multimodal-ai-approaches-for-pain-assessment/384009.
- Lemain, Victor. “How to Use Facial Expression Analysis in Pain Research.” *Noldus*, 5 Sept. 2024, https://noldus.com/blog/pain-research.
- El-Ghaish, Hany, et al. “Enhanced Deep Learning Framework for Real-Time Pain Assessment Using Multi-Modal Fusion of Facial Features and Video Streams.” *Engineering Applications of Artificial Intelligence*, vol. 137, pt. B, 2025, p. 109166, https://www.sciencedirect.com/science/article/abs/pii/S0952197625009662.
- Ghosh, Anay, et al. “A Novel Pain Sentiment Detection System Utilizing a PainCapsule Model and Textual Facial Patterns.” *Neurocomputing*, vol. 652, 1 Nov. 2025, p. 130907, https://www.sciencedirect.com/science/article/abs/pii/S0925231225015796.
- Adams, Meredith C. B., et al. “A Roadmap for Artificial Intelligence in Pain Medicine: Current Status, Opportunities, and Requirements.” *Current Opinion in Anesthesiology*, vol. 38, no. 5, Oct. 2025, pp. 680-688, https://journals.lww.com/co-anesthesiology/fulltext/2025/10000/a_roadmap_for_artificial_intelligence_in_pain.22.aspx.
- Gasmi, Karim, et al. “Enhanced Multimodal Physiological Signal Analysis for Pain Assessment Using Optimized Ensemble Deep Learning.” *Computer Modeling in Engineering & Sciences*, vol. 143, no. 2, 30 May 2025, pp. 2459-2489, https://www.sciopen.com/article/10.32604/cmes.2025.065817.
- Turk, Esra, et al. “Brains in Sync: Practical Guideline for Parent–Infant EEG During Natural Interaction.” *Frontiers in Psychology*, vol. 13, 27 Apr. 2022, p. 833112, https://www.frontiersin.org/articles/10.3389/fpsyg.2022.833112/full.
- Cascella, Marco, et al. “Artificial Intelligence for Automatic Pain Assessment: Research Methods and Perspectives.” *Pain Research and Management*, 2023, p. 6018736, https://pmc.ncbi.nlm.nih.gov/articles/PMC10322534/.
- El-Tallawy, Salah N., et al. “Incorporation of ‘Artificial Intelligence’ for Objective Pain Assessment: A Comprehensive Review.” *Pain and Therapy*, vol. 13, no. 3, 2 Mar. 2024, pp. 293-317, https://pmc.ncbi.nlm.nih.gov/articles/PMC11111436/.
- Cascella, Marco, et al. “Expert Consensus on Feasibility and Application of Automatic Pain Assessment in Routine Clinical Use.” *Journal of Anesthesia, Analgesia and Critical Care*, vol. 5, no. 1, 2 June 2025, p. 29, https://janesthanalgcritcare.biomedcentral.com/articles/10.1186/s44158-025-00249-8.
- *Construction and Validation of a Pain Facial Expressions Dataset for Critically Ill Children*. *Scientific Reports*, 2025, https://www.nature.com/articles/s41598-025-02247-w.
- *Harnessing the Power of AI to Improve Pain Management in the ER*. StopOpioid, 2025, https://stopioid.com/f/harnessing-the-power-of-ai-to-improve-pain-management-in-the-er.
- Gutierrez, Rommel, et al. “Multimodal AI Techniques for Pain Detection: Integrating Facial Gesture and Paralanguage Analysis.” *Frontiers in Computer Science*, vol. 6, 29 July 2024, p. 1424935, https://www.frontiersin.org/journals/computer-science/articles/10.3389/fcomp.2024.1424935/full.
- Xu, Jingying, et al. “Comprehensive Review on Personalized Pain Assessment and Multimodal Interventions for Postoperative Recovery Optimization.” *Journal of Pain Research*, vol. 18, 5 June 2025, pp. 2791-2804, https://pmc.ncbi.nlm.nih.gov/articles/PMC12147818/.
- *Automatic Pain Classification in Older Patients with Hip Fracture Based on Multimodal Information Fusion*. *Scientific Reports*, 2025, https://www.nature.com/articles/s41598-025-09046-3.
- Valero, Juan Jose, et al. “Polyarticular Pain Management in Elderly Patients and Artificial Intelligence: Precision Strategies for Complex Cases.” *J Clin Pract Med Case Rep*, 10 Aug. 2025, https://www.genesispub.org/polyarticular-pain-management-in-elderly-patients-and-artificial-intelligence-precision-strategies-for-complex-cases.
- Ray, Amit. "Navigation System for Blind People Using Artificial Intelligence." Compassionate AI, 2.5 (2018): 42-44. https://amitray.com/artificial-intelligence-for-assisting-blind-people/.
- Ray, Amit. "Artificial Intelligence to Combat Antibiotic Resistant Bacteria." Compassionate AI, 2.6 (2018): 3-5. https://amitray.com/artificial-intelligence-for-antibiotic-resistant-bacteria/.
- Ray, Amit. "Artificial Intelligence for Balance Control and Fall Detection of Elderly People." Compassionate AI, 4.10 (2018): 39-41. https://amitray.com/artificial-intelligence-for-balance-control-and-fall-detection-system-of-elderly-people/.
- Ray, Amit. "Artificial intelligence for Climate Change, Biodiversity and Earth System Models." Compassionate AI, 1.1 (2022): 54-56. https://amitray.com/artificial-intelligence-for-climate-change-and-earth-system-models/.
- Ray, Amit. "From Data-Driven AI to Compassionate AI: Safeguarding Humanity and Empowering Future Generations." Compassionate AI, 2.6 (2023): 51-53. https://amitray.com/from-data-driven-ai-to-compassionate-ai-safeguarding-humanity-and-empowering-future-generations/.
- Ray, Amit. "Calling for a Compassionate AI Movement: Towards Compassionate Artificial Intelligence." Compassionate AI, 2.6 (2023): 75-77. https://amitray.com/calling-for-a-compassionate-ai-movement/.
- Ray, Amit. "Ethical Responsibilities in Large Language AI Models: GPT-3, GPT-4, PaLM 2, LLaMA, Chinchilla, Gopher, and BLOOM." Compassionate AI, 3.7 (2023): 21-23. https://amitray.com/ethical-responsibility-in-large-language-ai-models/.
- Ray, Amit. "The 10 Ethical AI Indexes for LLM Data Training and Responsible AI." Compassionate AI, 3.8 (2023): 35-39. https://amitray.com/the-10-ethical-ai-indexes-for-responsible-ai/.
- Ray, Amit. "Compassionate AI-Driven Democracy: Power and Challenges." Compassionate AI, 3.9 (2024): 48-50. https://amitray.com/compassionate-ai-driven-democracy-power-and-challenges/.
- Ray, Amit. "The 7 Pillars of Compassionate AI Democracy." Compassionate AI, 3.9 (2024): 84-86. https://amitray.com/the-7-pillars-of-compassionate-ai-democracy/.
- Ray, Amit. "Modeling Consciousness in Compassionate AI: Transformer Models and EEG Data Verification." Compassionate AI, 3.9 (2025): 27-29. https://amitray.com/modeling-consciousness-in-compassionate-ai-transformer-models/.
- Ray, Amit. "Pain Recognition and Prediction AI Algorithm (PRPA) for Compassionate AI." Compassionate AI, 3.9 (2025): 60-62. https://amitray.com/pain-recognition-and-prediction-algorithm-prpa-for-compassionate-ai/.
- Ray, Amit. "Ray Mother–Infant Inter-brain Synchrony Algorithm for Deep Compassionate AI." Compassionate AI, 3.9 (2025): 60-62. https://amitray.com/ray-mother-infant-inter-brain-synchrony-algorithm-deep-compassionate-ai/.
- Ray, Amit. "AI Agents and Robots in Peacekeeping Force and Social Care: Compassionate AI Technologies." Compassionate AI, 3.9 (2025): 75-77. https://amitray.com/ai-agents-robots-peacekeeping-force-social-care-compassionate-ai/.
- Ray, Amit. "AI-Driven Rare Earth Element Magnet Design: Detailed Methodologies." Compassionate AI, 4.10 (2025): 27-29. https://amitray.com/ai-driven-rare-earth-elements-magnet-design-methodologies/.
- Ray, Amit. "Microbial AI, Bioleaching and Digital Twins for Manufacturing the 17 Rare Earth Elements." Compassionate AI, 4.10 (2025): 42-44. https://amitray.com/microbial-ai-bioleaching-and-digital-twins-for-manufacturing-the-17-rare-earth-elements/.