Partner Technology Assessment
Partner technology assessment evaluates an organisation’s IT infrastructure, security controls, and operational capabilities to determine readiness for technical collaboration. You conduct this assessment before granting system access, establishing data sharing arrangements, or integrating partner systems with your infrastructure.
The assessment produces a structured evaluation of partner capabilities against your organisation’s minimum requirements, identifies gaps requiring remediation, and assigns a risk rating that informs partnership decisions. Assessment depth scales with the sensitivity of planned collaboration: sharing aggregated programme data requires less scrutiny than granting access to beneficiary case management systems.
Prerequisites
Before beginning assessment, confirm these requirements are in place:
| Requirement | Detail | Verification |
|---|---|---|
| Partnership agreement | MoU or contract establishing collaboration scope | Document reference number |
| Executive sponsor | Named individual authorising assessment | Email confirmation |
| Partner contact | Technical contact with authority to respond | Name, role, email, phone |
| Assessment scope | Systems, data, and access types planned | Written scope statement |
| Timeline | Assessment deadline and decision date | Calendar dates confirmed |
| Resources | Assessor availability (8-40 hours depending on scope) | Resource allocation confirmed |
Gather partnership context documents including the proposed data sharing agreement, integration requirements, and any donor requirements that mandate partner assessments. USAID, FCDO, and EU funding mechanisms often specify minimum partner security standards that inform your assessment criteria.
Conflict of interest
Do not assess partners when you have financial interest in the assessment outcome or existing personal relationships with partner staff. Declare conflicts to your supervisor and assign alternative assessors.
Assessment Scoping
Assessment scope determines depth, duration, and methodology. Scope derives from planned collaboration type and data sensitivity.
Collaboration tiers define baseline assessment requirements:
| Tier | Collaboration type | Data sensitivity | Assessment depth | Typical duration |
|---|---|---|---|---|
| 1 | Aggregated data sharing | Non-personal | Questionnaire only | 2-4 hours |
| 2 | System access (read) | Personal, non-sensitive | Questionnaire + documentation review | 8-16 hours |
| 3 | System access (write) | Personal, non-sensitive | Full assessment | 16-24 hours |
| 4 | Data sharing (personal) | Sensitive personal data | Full assessment + technical verification | 24-40 hours |
| 5 | Protection/safeguarding | Survivor data, child protection | Full assessment + on-site verification | 40+ hours |
A partner receiving read-only access to your programme dashboard (Tier 2) requires questionnaire completion and review of their acceptable use policy. A partner who will enter beneficiary data into your case management system (Tier 3) requires full assessment including infrastructure and security evaluation. A partner handling protection case referrals (Tier 5) requires the most rigorous assessment including verification of physical security and staff vetting procedures.
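For teams that track assessments in tooling, the tier-to-components mapping above can be sketched as a small helper. This is an illustrative sketch only; the component names are not a fixed schema.

```python
def assessment_components(tier: int) -> list:
    """Baseline assessment components for a collaboration tier,
    following the tier table above. Component names are illustrative."""
    components = ["questionnaire"]
    if tier >= 2:
        components.append("documentation_review")
    if tier >= 3:  # "full assessment"
        components += ["infrastructure_assessment", "security_assessment",
                       "application_assessment"]
    if tier >= 4:
        components.append("technical_verification")
    if tier >= 5:
        components.append("on_site_verification")
    return components

# Tier 2 (read access): questionnaire plus documentation review.
tier2 = assessment_components(2)
```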
Document scope in writing before proceeding:
PARTNER TECHNOLOGY ASSESSMENT SCOPE
Partner organisation: [Name]
Assessment date: [YYYY-MM-DD]
Assessor: [Name, role]

Planned collaboration:
- [Specific system access / data sharing / integration]
- Data types: [Personal/non-personal, sensitivity level]
- Duration: [Temporary/ongoing]

Assessment tier: [1-5]
Assessment components:
- [ ] Questionnaire
- [ ] Documentation review
- [ ] Infrastructure assessment
- [ ] Security assessment
- [ ] Application assessment
- [ ] Technical verification
- [ ] On-site verification

Timeline:
- Questionnaire sent: [Date]
- Response deadline: [Date]
- Assessment complete: [Date]
- Decision required: [Date]

Approved by: [Name, role, date]

Procedure
Send the assessment questionnaire to the partner technical contact with clear instructions and deadline.
Use the embedded template at the end of this page. Customise the questionnaire by removing sections irrelevant to your scope tier. A Tier 1 assessment (aggregated data only) requires only sections A (Organisation Profile) and B (Data Handling). A Tier 4 assessment requires sections A through I; Tier 5 additionally requires section J (Protection-Specific).
Set the response deadline 10-15 working days from the send date. Shorter deadlines produce incomplete responses; longer deadlines lose momentum.
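The 10-15 working-day window can be computed with a small helper when scheduling. This is a sketch assuming a Monday-Friday working week; public holidays are ignored.

```python
from datetime import date, timedelta

def add_working_days(start: date, working_days: int) -> date:
    """Return the date `working_days` working days (Mon-Fri) after `start`."""
    current = start
    added = 0
    while added < working_days:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            added += 1
    return current

# A questionnaire sent on Monday 3 June 2024 with a 10-working-day
# deadline falls due two calendar weeks later.
deadline = add_working_days(date(2024, 6, 3), 10)
```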
Subject: Technology Assessment Questionnaire - [Partnership Name]
Dear [Contact name],
As part of establishing our partnership for [collaboration description], we require completion of the attached technology assessment questionnaire.
Please complete all highlighted sections and return by [date]. Where questions are not applicable to your organisation, note "N/A" with brief explanation.
We are happy to schedule a call to discuss any questions about the assessment process or specific questionnaire items.
Required attachments:
- Information security policy (or equivalent)
- Data protection/privacy policy
- [Additional documents per scope]
Please direct questions to [assessor email].

Review the completed questionnaire for completeness and initial red flags.
Check every required field contains a response. Mark incomplete sections for follow-up. Flag responses requiring clarification or additional evidence.
Initial red flags requiring immediate attention:
| Red flag | Indicator | Action |
|---|---|---|
| No security policy | “None” or “In development” for security policy question | Cannot proceed to Tier 3+ without remediation |
| No data protection policy | Missing or “Not applicable” | Cannot proceed with personal data sharing |
| Shared credentials | “Yes” to shared account question | Requires remediation plan |
| No backup | “None” or “Manual only” for backup question | Risk escalation for data sharing |
| No encryption | “No” to encryption at rest or in transit | Cannot share sensitive data |

Request missing information before proceeding to detailed assessment. Send a single consolidated request listing all gaps rather than multiple follow-up emails.
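The red-flag check can be automated for questionnaires captured in structured form. The field names below are illustrative, not a real response schema.

```python
# Hypothetical red-flag screen over questionnaire answers.
# Field names (security_policy, backup, ...) are illustrative only.
RED_FLAGS = {
    "security_policy": {"None", "In development"},
    "data_protection_policy": {"Missing", "Not applicable"},
    "shared_credentials": {"Yes"},
    "backup": {"None", "Manual only"},
    "encryption_at_rest": {"No"},
    "encryption_in_transit": {"No"},
}

def screen(responses: dict) -> list:
    """Return the list of red-flagged fields in a set of responses."""
    return [field for field, bad in RED_FLAGS.items()
            if responses.get(field) in bad]

flags = screen({"security_policy": "In development",
                "backup": "Daily automated"})
# -> ["security_policy"]
```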
Evaluate infrastructure capabilities against minimum requirements.
Assess the partner’s infrastructure based on questionnaire responses and documentation. For Tier 3+ assessments, request evidence for critical claims (screenshots, configuration exports, policy documents).
Network and connectivity assessment:
Verify the partner has adequate connectivity for planned collaboration. A partner entering data into your cloud-based case management system needs reliable internet access. Partners in areas with intermittent connectivity need offline capability or alternative data submission methods.
Record: Primary connection type, backup connectivity, typical bandwidth, reliability issues.
Endpoint management assessment:
Determine how partner devices are managed and secured. The minimum acceptable standard varies by tier: Tier 2 collaboration can proceed with basic antivirus and manual updates; Tier 4+ requires managed devices with enforced security policies.
Record: Device types in use, management approach (MDM, manual, none), security software, patching approach, encryption status.
Identity and access assessment:
Evaluate how partner staff authenticate to systems. Individual accounts with passwords meet minimum requirements for Tier 2. MFA is required for Tier 3+. Centralised identity management (directory service) indicates higher maturity.
Record: Identity provider (if any), authentication methods, MFA status, password policy, account lifecycle process.
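The authentication minimums described above can be expressed as a quick check. A sketch only: it encodes individual accounts as the Tier 2 floor and MFA as mandatory from Tier 3.

```python
def meets_auth_minimum(tier: int, individual_accounts: bool,
                       mfa_enabled: bool) -> bool:
    """Check authentication against the tier minimums described above:
    individual accounts at minimum, MFA required for Tier 3+."""
    if not individual_accounts:
        return False  # shared accounts do not meet the minimum
    if tier >= 3 and not mfa_enabled:
        return False  # MFA is required for Tier 3 and above
    return True

# Tier 2 with individual passworded accounts passes; Tier 3 without MFA fails.
ok_tier2 = meets_auth_minimum(2, individual_accounts=True, mfa_enabled=False)
```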
Evaluate security posture against minimum requirements.
Security assessment examines policies, technical controls, and operational practices. Weight findings by collaboration tier: gaps acceptable for Tier 2 partnerships become blocking issues for Tier 4+.
Policy assessment:
Review submitted policies for completeness and implementation evidence. A policy document alone indicates awareness; implementation evidence (training records, incident logs, audit results) indicates operational security.
Required policies by tier:
| Policy | Tier 2 | Tier 3 | Tier 4 | Tier 5 |
|---|---|---|---|---|
| Acceptable use | Required | Required | Required | Required |
| Information security | Recommended | Required | Required | Required |
| Data protection/privacy | Required | Required | Required | Required |
| Incident response | Not required | Recommended | Required | Required |
| Access control | Not required | Required | Required | Required |
| Staff vetting | Not required | Not required | Recommended | Required |

Technical controls assessment:
Evaluate security controls based on questionnaire responses and any evidence provided. For Tier 4+ assessments, request verification evidence (configuration screenshots, scan reports, audit logs).
Score each control area:
CONTROL ASSESSMENT SCORING
3 - Fully implemented: Control exists, documented, evidence of operation
2 - Partially implemented: Control exists but gaps in coverage or documentation
1 - Planned: Control recognised as needed, implementation in progress
0 - Not implemented: Control absent, or “not applicable” response without justification

Control areas for assessment:
Perimeter security: Firewall present, configured, managed. Score 3 requires managed firewall with documented ruleset; score 0 indicates no firewall or reliance on ISP-provided device without configuration.
Endpoint protection: Antivirus/EDR deployed, updated, monitored. Score 3 requires managed endpoint protection with central visibility; score 1 indicates consumer antivirus without management.
Data encryption: At rest and in transit. Score 3 requires full disk encryption on endpoints and TLS for all data transmission; score 0 indicates no encryption.
Backup and recovery: Regular backups, tested restores. Score 3 requires automated backups with documented restore testing; score 1 indicates manual backups without testing.
Vulnerability management: Scanning, patching, remediation. Score 3 requires regular scanning with tracked remediation; score 0 indicates no vulnerability awareness.
Incident response: Process, contacts, testing. Score 3 requires documented process with identified responders and annual testing; score 1 indicates informal process without documentation.
Evaluate application and data handling practices.
Assess how the partner handles data relevant to your collaboration. This section carries greatest weight for data sharing partnerships.
Application security:
If partners will access your applications, evaluate their endpoint security. If you will receive data from partner applications, evaluate those applications’ security.
For partner-hosted applications you will integrate with:
- Authentication method (local accounts, SSO, federated)
- Authorisation model (role-based, attribute-based, custom)
- Audit logging (present, complete, retained)
- Security testing (penetration testing, code review, neither)
- Hosting (self-hosted, cloud provider, shared hosting)
Data handling practices:
Evaluate data lifecycle practices through questionnaire responses and policy review.
Data classification: Does the partner have a data classification scheme? Do they classify data shared with your organisation appropriately?
Access control: How is access to sensitive data restricted? Individual basis, role basis, or unrestricted within organisation?
Retention and disposal: What retention period applies to shared data? How is data disposed of when no longer needed?
Cross-border transfers: Where is data stored? If outside partner’s country, under what legal basis? If cloud-hosted, which jurisdiction?
Calculate risk rating based on assessment findings.
Risk rating combines control scores with collaboration tier to produce an overall partnership risk level. Higher tiers require higher scores to achieve acceptable risk.
Calculate control score:
Sum individual control scores (0-3 each) across all assessed areas. The maximum depends on scope: Tier 2 assessments evaluate the six core controls (max 18); Tier 3 adds identity management and access control (8 controls, max 24); Tier 4 adds security monitoring and physical security (10 controls, max 30); Tier 5 adds staff vetting and protection protocols (12 controls, max 36).
CONTROL SCORE CALCULATION
Control area                    | Score (0-3)
--------------------------------|------------
Perimeter security              | [ ]
Endpoint protection             | [ ]
Data encryption                 | [ ]
Backup and recovery             | [ ]
Vulnerability management        | [ ]
Incident response               | [ ]
Identity management             | [ ] (Tier 3+)
Access control                  | [ ] (Tier 3+)
Security monitoring             | [ ] (Tier 4+)
Physical security               | [ ] (Tier 4+)
Staff vetting                   | [ ] (Tier 5)
Protection protocols            | [ ] (Tier 5)
--------------------------------|------------
Total                           | [ ] / [max]
Percentage                      | [ ]%

Determine risk rating:
| Control score | Tier 1-2 | Tier 3 | Tier 4 | Tier 5 |
|---|---|---|---|---|
| 80-100% | Low | Low | Low | Medium |
| 60-79% | Low | Medium | Medium | High |
| 40-59% | Medium | Medium | High | Critical |
| Below 40% | Medium | High | Critical | Critical |
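The percentage calculation and the matrix lookup above can be encoded together as a small sketch for assessment tooling. Control names are illustrative.

```python
def control_percentage(scores: dict) -> float:
    """Percentage of the maximum achievable score; each control is 0-3."""
    return 100 * sum(scores.values()) / (3 * len(scores))

def risk_rating(percentage: float, tier: int) -> str:
    """Map a control-score percentage and collaboration tier to a risk
    rating, following the matrix above."""
    bands = [(80, ["Low", "Low", "Low", "Medium"]),
             (60, ["Low", "Medium", "Medium", "High"]),
             (40, ["Medium", "Medium", "High", "Critical"]),
             (0,  ["Medium", "High", "Critical", "Critical"])]
    column = 0 if tier <= 2 else tier - 2  # tiers 1-2 share a column
    for threshold, ratings in bands:
        if percentage >= threshold:
            return ratings[column]

# Tier 2 example: six controls, 13 of 18 points -> ~72% -> Low risk.
tier2_scores = {"perimeter_security": 3, "endpoint_protection": 2,
                "data_encryption": 3, "backup_recovery": 2,
                "vulnerability_management": 1, "incident_response": 2}
rating = risk_rating(control_percentage(tier2_scores), 2)  # "Low"
```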
Risk rating implications:
Low risk: Proceed with partnership. Standard monitoring applies.
Medium risk: Proceed with documented risk acceptance and specific mitigations. Review at 6 months.
High risk: Proceed only with executive approval, significant mitigations, and enhanced monitoring. Review at 3 months.
Critical risk: Do not proceed without substantial remediation. Reassess after remediation complete.
Document findings and recommendations in the assessment report.
Compile assessment findings into a structured report. The report serves three purposes: informing partnership decisions, documenting due diligence for audit, and providing remediation guidance to the partner.
PARTNER TECHNOLOGY ASSESSMENT REPORT
Partner: [Organisation name]
Assessment date: [YYYY-MM-DD]
Assessor: [Name, role]
Assessment tier: [1-5]
Collaboration scope: [Description]
EXECUTIVE SUMMARY
Overall risk rating: [Low/Medium/High/Critical]
Recommendation: [Proceed/Proceed with conditions/Do not proceed]

Key findings:
- [Finding 1]
- [Finding 2]
- [Finding 3]
DETAILED FINDINGS
Infrastructure Assessment
[Findings organised by area]

Security Assessment
Control scores: [Table of control areas and scores]
[Narrative findings for each area]

Data Handling Assessment
[Findings on data practices]
GAP ANALYSIS
Critical gaps (blocking):

| Gap | Impact | Remediation required |
|---|---|---|
| [Gap] | [Impact] | [Required action] |

Significant gaps (mitigatable):

| Gap | Impact | Recommended mitigation |
|---|---|---|
| [Gap] | [Impact] | [Mitigation] |

Minor gaps (acceptable):

| Gap | Impact | Notes |
|---|---|---|
| [Gap] | [Impact] | [Notes] |
RECOMMENDATIONS
For partnership decision: [Recommendation with rationale]
Conditions for proceeding (if applicable):
1. [Condition]
2. [Condition]
For partner remediation: [Prioritised remediation recommendations]
APPENDICES
A. Completed questionnaire
B. Supporting documentation received
C. Evidence reviewed

Communicate findings to stakeholders and partner.
Share the assessment report with your internal stakeholders (partnership manager, security lead, data protection officer if applicable) before communicating with the partner.
Internal communication includes the full report and your recommendation. Allow 3-5 days for internal review and decision.
Partner communication depends on outcome:
For proceed recommendations: Share findings summary, highlight strengths, provide remediation recommendations for gaps identified. Frame as collaborative improvement, not audit failure.
For conditional proceed: Clearly state conditions required before collaboration begins. Offer support for remediation where appropriate. Set timeline for re-assessment.
For do not proceed: Explain findings factually without judgement. Identify what would need to change for future reassessment. Maintain relationship for future opportunities.
Assessment Workflow
The workflow proceeds through nine stages:

1. Scope: confirm tier, select components, document scope.
2. Questionnaire: send to partner, set deadline, track response.
3. Review: check completeness, flag gaps, request clarification.
4. Detailed assessment: infrastructure, security, data handling, applications.
5. Risk rating: calculate control scores, determine rating.
6. Report: compile findings, gap analysis, recommendations.
7. Internal review: stakeholders review findings and approve.
8. Decision: proceed, conditional proceed, or do not proceed.
9. Partner communication: share findings and remediation guidance.

Capability Maturity Assessment
For Tier 4+ assessments, supplement control scoring with capability maturity assessment. Maturity levels indicate sustainability of security practices over time.
| Level | Name | Characteristics |
|---|---|---|
| 5 | Optimising | Continuous improvement based on metrics. Security embedded in organisational culture. Proactive threat anticipation. |
| 4 | Managed | Metrics collected and used for decision-making. Controls regularly tested. Security integrated with operations. |
| 3 | Defined | Documented policies and procedures. Consistent practices across organisation. Training programme established. |
| 2 | Developing | Basic controls in place. Some documentation. Practices inconsistent or dependent on individuals. |
| 1 | Initial | Ad hoc practices. No documentation. Reactive to incidents. Security dependent on individual awareness. |

Map questionnaire responses and evidence to maturity levels for each domain. Tier 4 partnerships require minimum Level 2 across all domains. Tier 5 partnerships require minimum Level 3 for data handling and incident response.
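The tier minimums for maturity can be checked mechanically. A sketch under the rules stated above; domain names are illustrative.

```python
def maturity_acceptable(tier: int, levels: dict) -> bool:
    """Check per-domain maturity levels (1-5) against the tier minimums
    described above: Tier 4+ requires Level 2 in every domain; Tier 5
    additionally requires Level 3 for data handling and incident response.
    Domain names are illustrative."""
    if tier >= 4 and min(levels.values()) < 2:
        return False
    if tier == 5:
        for domain in ("data_handling", "incident_response"):
            if levels.get(domain, 0) < 3:
                return False
    return True

# Tier 4 partner at Level 2 everywhere passes the minimum.
ok = maturity_acceptable(4, {"data_handling": 2, "incident_response": 2,
                             "endpoint_management": 2})
```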
Risk Rating Matrix
| Control score | Tier 1-2 (Aggregated/Read) | Tier 3 (Write access) | Tier 4 (Personal data) | Tier 5 (Protection data) |
|---|---|---|---|---|
| 80-100% (Strong) | Low: proceed | Low: proceed | Low: proceed | Medium: proceed with monitoring |
| 60-79% (Adequate) | Low: proceed | Medium: proceed with plan | Medium: proceed with plan | High: executive approval and mitigations |
| 40-59% (Gaps) | Medium: proceed with plan | Medium: proceed with plan | High: executive approval | Critical: do not proceed, remediate first |
| Below 40% (Inadequate) | Medium: review needed | High: executive approval | Critical: do not proceed | Critical: do not proceed |

Low: standard monitoring. Medium: documented mitigations. High: executive approval plus enhanced monitoring. Critical: do not proceed without remediation.

Verification
Assessment completion requires confirmation of these outcomes:
Questionnaire completeness: All required sections contain responses. Missing sections have documented follow-up or justified exclusion.
Evidence reviewed: For Tier 3+ assessments, policy documents received and reviewed. For Tier 4+, technical evidence (configuration screenshots, scan reports) received for critical controls.
Control scores assigned: Every assessed control has a score (0-3) with documented rationale.
Risk rating calculated: Overall risk rating determined using the matrix. Rating justified by control scores and collaboration tier.
Recommendations provided: Clear recommendation (proceed, conditional proceed, do not proceed) with rationale. For conditional proceed, specific conditions documented.
Report complete: Assessment report contains all required sections. Gap analysis identifies blocking, significant, and minor gaps.
Stakeholder review: Internal stakeholders reviewed findings and approved recommendation. Decision documented.
Partner communication: Partner informed of assessment outcome. For conditional proceed, conditions communicated in writing.
Troubleshooting
| Issue | Cause | Resolution |
|---|---|---|
| Partner does not respond to questionnaire | Contact incorrect, questionnaire lost, partner deprioritised | Verify contact details, resend questionnaire, escalate to partnership manager for follow-up |
| Questionnaire returned incomplete | Questions unclear, partner lacks information, partner avoiding disclosure | Send consolidated list of missing items with clarification, offer call to discuss, extend deadline 5 days |
| Partner refuses to provide evidence | Confidentiality concerns, evidence does not exist, trust issues | Explain evidence handling (will not share externally), accept alternative evidence forms, document refusal in findings |
| Control scores difficult to determine | Vague questionnaire responses, conflicting information | Request specific clarification, document uncertainty in assessment, apply precautionary scoring |
| Partner disputes findings | Misunderstanding, outdated information, genuine disagreement | Review evidence together, accept corrections with evidence, document disputed findings with both perspectives |
| Assessment reveals unexpected critical gaps | Partner less mature than expected, scope underestimated | Do not adjust findings to fit desired outcome, document accurately, consider whether partnership should proceed |
| Partner requests assessment reciprocity | Fair request, relationship management | Agree if reasonable, provide equivalent information, negotiate scope |
| Assessment timeline insufficient | Complex environment, partner delays, assessor capacity | Request deadline extension, reduce scope if appropriate, document limitations |
| Partner recently experienced breach | Current security state uncertain, recovery incomplete | Assess current controls, request breach details if willing to share, consider elevated risk rating |
| Donor requires specific assessment framework | USAID, FCDO, EU have different requirements | Map your assessment to donor framework, supplement with donor-specific questions, document compliance |
Partner IT Assessment Template
Use this questionnaire template for partner assessments. Remove sections not applicable to your assessment tier.
PARTNER IT ASSESSMENT QUESTIONNAIRE
Instructions: Complete all sections marked as required for your assessment tier. Where a question is not applicable, respond “N/A” with brief explanation. Attach supporting documentation where requested.
SECTION A: ORGANISATION PROFILE (All tiers)
A1. Organisation name:
A2. Primary office location (city, country):
A3. Number of staff using IT systems:
A4. Number of office locations:
A5. IT staffing:
- No dedicated IT staff
- Part-time IT responsibility
- 1 dedicated IT person
- 2-5 IT staff
- More than 5 IT staff
A6. Describe your IT support arrangements (internal, outsourced, vendor support):
SECTION B: DATA HANDLING (All tiers)
B1. Do you have a data protection/privacy policy?
- Yes (attach copy)
- In development
- No
B2. Do you have a data classification scheme?
- Yes (describe levels)
- No
B3. What types of personal data do you handle? (Select all that apply)
- Staff data only
- Beneficiary contact information
- Beneficiary demographic data
- Health data
- Financial data
- Protection/safeguarding case data
- Children’s data
- Other sensitive personal data (describe)
B4. Where is personal data stored? (Select all that apply)
- Local devices
- Organisation servers (on-premises)
- Cloud storage (specify provider and region)
- Partner systems
- Paper records
B5. How long do you retain personal data?
B6. How is personal data disposed of when no longer needed?
B7. Do you transfer personal data across international borders?
- No
- Yes (describe destinations and legal basis)
SECTION C: INFRASTRUCTURE (Tier 2+)
C1. Describe your internet connectivity:
- Primary connection type and speed:
- Backup connection (if any):
- Reliability issues:
C2. What devices do staff use? (Select all that apply)
- Organisation-owned desktops
- Organisation-owned laptops
- Organisation-owned tablets
- Organisation-owned mobile phones
- Personal devices (BYOD)
C3. How are devices managed?
- Mobile Device Management (MDM) - specify product
- Domain/directory management
- Manual management
- Not centrally managed
C4. What operating systems are in use?
- Desktops/laptops:
- Mobile devices:
C5. Do you have servers on-premises?
- No
- Yes (describe purpose and location)
C6. What cloud services do you use? (List primary services)
SECTION D: SECURITY CONTROLS (Tier 2+)
D1. Do you have an information security policy?
- Yes (attach copy)
- In development
- No
D2. What endpoint protection is deployed?
- None
- Consumer antivirus (specify)
- Business antivirus (specify)
- Endpoint Detection and Response (specify)
D3. Is full disk encryption enabled on devices?
- Yes, all devices
- Yes, some devices
- No
D4. Do you have a firewall?
- Yes, managed/configured
- Yes, default configuration
- ISP-provided only
- No
D5. How do staff authenticate to systems?
- Individual accounts with passwords
- Individual accounts with MFA
- Shared accounts
- Mixed (describe)
D6. What is your password policy?
- Minimum length:
- Complexity requirements:
- Expiry period:
D7. Is multi-factor authentication (MFA) enabled?
- Yes, all systems
- Yes, some systems (specify which)
- No
D8. How are user accounts created and removed?
- Creation process:
- Removal process:
- Typical time to remove departing staff access:
SECTION E: BACKUP AND RECOVERY (Tier 3+)
E1. What data is backed up?
E2. How frequently are backups performed?
E3. Where are backups stored?
- Same location as source
- Different physical location
- Cloud backup service (specify)
E4. Are backups encrypted?
- Yes
- No
E5. When did you last test restoring from backup?
- Within last month
- Within last 3 months
- Within last year
- Never tested
- Unknown
E6. What is your target recovery time for critical systems?
SECTION F: INCIDENT RESPONSE (Tier 3+)
F1. Do you have an incident response process?
- Yes, documented (attach or describe)
- Yes, informal
- No
F2. Who is responsible for responding to security incidents?
- Name/role:
- Contact details:
F3. Have you experienced a security incident in the past 24 months?
- No
- Yes (describe type, not details)
F4. Do you have cyber insurance?
- Yes
- No
SECTION G: VULNERABILITY MANAGEMENT (Tier 4+)
G1. Do you perform vulnerability scanning?
- Yes, regularly (frequency: ___)
- Yes, occasionally
- No
G2. How are software updates/patches applied?
- Automatically
- Manually, within 30 days of release
- Manually, less frequently
- Inconsistently
G3. When were operating systems last updated?
- Servers:
- Desktops:
- Mobile devices:
SECTION H: PHYSICAL SECURITY (Tier 4+)
H1. How is physical access to offices controlled?
H2. How is physical access to server/network equipment controlled?
H3. What happens to devices at end of life?
SECTION I: STAFF AND TRAINING (Tier 4+)
I1. Do staff receive security awareness training?
- Yes, at onboarding
- Yes, annually
- Yes, more frequently
- No
I2. Do staff undergo background checks?
- Yes, all staff
- Yes, some roles
- No
I3. Do you have an acceptable use policy?
- Yes (attach copy)
- No
SECTION J: PROTECTION-SPECIFIC (Tier 5 only)
J1. What systems handle protection/safeguarding case data?
J2. How is access to protection data restricted?
J3. What additional controls exist for protection data?
J4. How is consent for protection data documented?
J5. What is your process for secure information sharing with other protection actors?
J6. Describe physical security for locations where protection data is accessed:
DECLARATION
I confirm the information provided is accurate to the best of my knowledge.
Name:
Role:
Date:
Signature:
See also
- Third-Party Security Risk: risk concepts and frameworks
- Partner Onboarding: post-assessment onboarding procedures
- Partner System Integration: technical integration procedures
- Responsible Data Framework: data sharing principles
- Donor IT Requirements: donor-specific assessment requirements