
Security Incident Reporting Complete Guide: Process, Deadlines, FAQ [2025]

10 min read
#Security #Incident Reporting #Security Incident #Cybersecurity Law #Incident Response


A security incident has occurred. What should you do?

Besides technical handling, there's one more important thing: reporting.

The Cybersecurity Management Act requires certain organizations to report security incidents. Failure to report or late reporting may result in penalties.

This article explains the complete security incident reporting process.

After reading, you'll know: what situations require reporting, how soon to report, how to report, and to whom.

What is a Security Incident?

First, let's define clearly: what counts as a "security incident"?

Definition of Security Incident

According to the Cybersecurity Management Act enforcement rules, a security incident is:

A condition of a system, service, or network which, once identified, indicates a possible violation of cybersecurity policy or a failure of protective measures, or a previously unknown situation that may be security-related.

In plain language:

  • System intrusion or attempted intrusion
  • Data theft, leakage, or tampering
  • Service interruption or anomalies
  • Other events affecting information security

Common Security Incident Types

  • Malware: ransomware, viruses, trojans
  • Intrusion attacks: system hacked, backdoor planted
  • Data breach: personal data leak, confidential data theft
  • Service disruption: DDoS attack, system crash
  • Account abuse: stolen accounts, privilege abuse
  • Website defacement: web pages replaced, malware injected
  • Phishing fraud: successful phishing email attack

Situations That May Not Require Reporting

The following situations may not trigger a reporting obligation:

  • Attacks blocked by firewall/antivirus
  • Received phishing email but didn't click
  • Vulnerability scanning found flaws (but not yet exploited)
  • Brief interruption due to system maintenance

But if you're unsure, it's recommended to report anyway. Better to over-report than miss one.

Security Incident Reporting Obligations

Who must report? What situations require reporting?

Obligated Reporters

Under the Cybersecurity Management Act, those with reporting obligations include:

Government Agencies

All government agencies must report security incidents.

Specific Non-Government Agencies

Specific non-government agencies designated by the competent authority also have reporting obligations, including:

  • Critical infrastructure providers
  • State-owned enterprises
  • Government-funded foundations

Reporting Recipients

Government Agencies

Report to the following:

  1. Supervisory agency
  2. Administration for Cyber Security, Ministry of Digital Affairs

Specific Non-Government Agencies

Report to central competent authorities.

For example:

  • Financial industry → Financial Supervisory Commission
  • Telecommunications industry → NCC
  • Energy industry → Ministry of Economic Affairs

What Situations Require Reporting?

Simply put: report when you discover a security incident.

But there's a prerequisite: the incident must reach a certain level of impact.

Must Report

  • System intrusion
  • Data breach
  • Service disruption exceeding certain duration
  • Affects other agencies or the public

Case-by-Case

  • Minor incidents (e.g., single computer virus, already cleaned)
  • Attempted attacks (successfully blocked)

In practice, use the "better safe than sorry" principle. Report if in doubt, let the authority determine.

Security Incident Reporting Deadlines

Reporting has deadlines. Missing deadlines may result in penalties.

Incident Levels and Deadlines

Security incidents are classified into four levels, each with different reporting deadlines:

  • Level 4 (affects other agencies or the public): initial report within 1 hour, detailed report within 8 hours
  • Level 3 (core business unable to operate): initial report within 8 hours, detailed report within 24 hours
  • Level 2 (core business affected but operational): initial report within 24 hours, detailed report within 72 hours
  • Level 1 (non-core business affected): initial report within 72 hours, detailed report within 7 days
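The level-to-deadline mapping above can be sketched as a small lookup, for example in a duty-officer checklist script. This is an illustrative helper, not official tooling; always confirm deadlines against the current regulations.

```python
from datetime import datetime, timedelta

# Deadlines from the level table above, counted from the time of awareness:
# level -> (initial report due, detailed report due)
DEADLINES = {
    4: (timedelta(hours=1), timedelta(hours=8)),
    3: (timedelta(hours=8), timedelta(hours=24)),
    2: (timedelta(hours=24), timedelta(hours=72)),
    1: (timedelta(hours=72), timedelta(days=7)),
}

def report_deadlines(level: int, awareness_time: datetime) -> tuple[datetime, datetime]:
    """Return (initial_report_due, detailed_report_due) for an incident level."""
    initial, detailed = DEADLINES[level]
    return awareness_time + initial, awareness_time + detailed

# Example: a Level 4 incident confirmed at 09:00 must have an initial
# report by 10:00 and a detailed report by 17:00 the same day.
aware = datetime(2025, 1, 6, 9, 0)
initial_due, detailed_due = report_deadlines(4, aware)
```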

How to Determine Level?

Level 4 (Most Severe)

Characteristics:

  • Impact extends beyond organization
  • May cause public rights damage
  • Draws public attention

Examples:

  • Mass personal data breach
  • Critical service paralysis
  • Attack involving national security

Level 3

Characteristics:

  • Core systems unable to operate
  • Business shutdown

Examples:

  • Main systems ransomed
  • Critical database corrupted
  • Extended service interruption

Level 2

Characteristics:

  • Core systems affected but still usable
  • Performance degraded but not shutdown

Examples:

  • Some systems intruded
  • Confidential data possibly leaked
  • Intermittent service disruption

Level 1

Characteristics:

  • Non-core systems affected
  • Limited impact scope

Examples:

  • Single computer infected
  • Test environment intruded
  • Minor data anomalies
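As a rough triage aid, the characteristics above can be encoded as a simple decision function that checks the most severe condition first. This is a sketch only; actual classification should follow the competent authority's published criteria.

```python
def classify_incident(affects_public: bool, core_down: bool, core_affected: bool) -> int:
    """Illustrative triage helper following the level criteria above.

    Checks the most severe condition first and returns an incident level 1-4.
    """
    if affects_public:
        return 4  # impact extends beyond the organization
    if core_down:
        return 3  # core business unable to operate
    if core_affected:
        return 2  # core business degraded but still operational
    return 1      # non-core impact only
```

For example, a ransomed main system that halts core business (but has not affected outside parties) classifies as Level 3.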

When Does the Clock Start?

When does the deadline start counting?

Time of Awareness

Starts from "awareness of the incident."

Awareness means that someone in the organization has discovered the event and confirmed it is a security incident.

For example:

  • IT personnel receives alert
  • Employee reports anomaly
  • External notification

Not the Time of Occurrence

The incident may have occurred two weeks ago, but you discovered it today. The deadline starts from today.

Consequences of Late Reporting

Administrative Penalties

Failure to report within the required deadline can be fined NT$300,000 to NT$5,000,000.

Penalties may be imposed consecutively: each instance of continued non-compliance can be fined separately.

Other Impacts

  • Competent authority scrutiny
  • Enhanced audits
  • Reputation damage

Reporting Platform Operations

How do you actually report?

Reporting Channels

Government Agencies

Use the "Government Information Security Incident Reporting Platform" (G-ISAC).

URL: https://gisac.nat.gov.tw

Specific Non-Government Agencies

Use the "National Information Sharing and Analysis Center" (N-ISAC).

URL: https://www.nisac.nat.gov.tw

Or report through channels designated by central competent authorities.

Reporting Process

Step 1: Log into System

Log into the reporting platform with your agency account.

(If you don't have an account, apply to the competent authority first)

Step 2: Create Report

Fill out the report form, including:

  • Incident occurrence time
  • Discovery time
  • Incident type
  • Impact scope
  • Initial description
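Before submitting, it helps to sanity-check that every required field is filled in. The sketch below models the initial report as a plain dictionary; the field names are illustrative assumptions, not the platform's actual schema.

```python
# Field names here are illustrative only, not the reporting platform's schema.
initial_report = {
    "occurred_at": "2025-01-06T02:30:00+08:00",    # incident occurrence time
    "discovered_at": "2025-01-06T09:00:00+08:00",  # discovery/awareness time
    "incident_type": "ransomware",
    "impact_scope": "two file servers in one department",
    "description": "Files encrypted; affected servers isolated from the network.",
}

REQUIRED_FIELDS = [
    "occurred_at", "discovered_at", "incident_type", "impact_scope", "description",
]

# Collect any required field that is missing or left blank.
missing = [f for f in REQUIRED_FIELDS if not initial_report.get(f)]
```

An empty `missing` list means the draft covers all five items listed above and is ready to submit.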

Step 3: Submit Initial Report

Complete the initial report. The system will assign a case number.

Step 4: Submit Detailed Report

Within the specified time, supplement detailed information:

  • Affected systems
  • Damage assessment
  • Handling status
  • Technical details

Step 5: Closure Report

After incident handling is complete, submit closure report:

  • Incident cause
  • Handling methods
  • Improvement measures
  • Lessons learned

Key Report Content

A good report should include:

Basic Information

  • Contact person and contact method
  • Incident timeline
  • Initial level determination

Technical Information

  • Affected systems/services
  • Attack methods (if known)
  • Suspicious IPs, malware characteristics

Impact Assessment

  • Whether data was breached
  • Whether service was interrupted
  • Number of people/scope affected

Handling Status

  • Actions already taken
  • Current situation
  • Support needed

Security Incident Handling Process

Reporting is just one part. The complete incident handling process is as follows:

Phase 1: Detection and Identification

Discover Anomalies

Possible discovery sources:

  • Monitoring system alerts
  • Employee reports
  • External notifications
  • Abnormal logs

Confirm Incident

Initial judgment:

  • Is this a real attack or false positive?
  • How large is the impact scope?
  • What's the incident level?

Phase 2: Containment

Stop the Spread

Immediate actions:

  • Isolate infected systems
  • Block malicious IPs
  • Disable stolen accounts
  • Protect critical data

Preserve Evidence

Don't rush to clean up:

  • Preserve system logs
  • Save malware samples
  • Document handling process
  • Take screenshots

Phase 3: Reporting

Initial Report

Complete initial report within deadline:

  • Basic incident description
  • Initial impact assessment
  • Handling status

Detailed Report

Supplement detailed information:

  • Technical details
  • Damage assessment
  • Handling progress

Phase 4: Investigation and Eradication

Investigate Cause

Find out:

  • How did the attacker get in?
  • What vulnerability was exploited?
  • What did they do?
  • Did they leave backdoors?

Eradicate Threats

  • Remove malware
  • Patch vulnerabilities
  • Close backdoors
  • Reset account passwords

Phase 5: Recovery

System Recovery

  • Restore from backup
  • Rebuild damaged systems
  • Verify systems are normal

Service Restoration

  • Gradually restore services
  • Monitor for anomalies
  • Confirm stable operation

Phase 6: Review and Improvement

Post-Incident Review

  • Reconstruct incident sequence
  • Analyze handling successes and failures
  • Identify improvement points

Improvement Measures

  • Strengthen protection
  • Update policies
  • Enhance training
  • Update plans

Had a security incident and don't know how to handle it? Incident response takes professional experience. Contact us for emergency incident response support.

FAQ

Not sure if it's a security incident—should I report?

Recommend reporting.

You can first report as a "suspected security incident," then investigate and confirm.

The risk of missing a report is greater than a false report.

Incident already handled—should I still report?

Yes.

Your reporting obligation doesn't disappear just because you handled it.

Also, one purpose of reporting is to let authorities understand the situation. Even if you've handled it, this intelligence is still valuable.

Will reporting result in punishment?

Reporting itself won't result in punishment.

The Cybersecurity Management Act penalizes:

  • Failure to report as required
  • Failure to report within deadline
  • False report content

Proactive reporting is the correct behavior and won't be penalized for it.

Can I report anonymously?

Official reports cannot be anonymous. You need to fill in agency and contact information.

But if you've discovered someone else's security problem, you can submit anonymously through vulnerability reporting channels.

Will report data be made public?

No.

Report data is confidential and only visible to competent authorities and related units.

However, major incidents may be covered by media (not leaked from the reporting system).

Do small companies also need to report?

Depends on whether you're designated as a "specific non-government agency."

If not designated, there's no legal reporting obligation.

But if personal data breach is involved, you may need to report under the Personal Data Protection Act.

What if an incident occurs on a holiday?

Deadlines still apply.

Holidays are not a reason to extend reporting deadlines.

Recommend planning holiday duty and reporting procedures in advance.

Can I have a vendor help with reporting?

The reporting obligation is yours and cannot be fully delegated.

But vendors can assist with:

  • Filling out report content
  • Providing technical information
  • Assisting with incident handling

The final report submission must still be done by you.

For more on cybersecurity regulations, see Cybersecurity Management Act Complete Guide.

Next Steps

Security incident reporting is an obligation and also a way to protect yourself.

Recommended Actions

Pre-Incident Preparation

  1. Confirm your reporting obligations and recipients
  2. Apply for reporting platform accounts
  3. Establish internal reporting procedures
  4. Designate responsible persons and alternates
  5. Conduct reporting drills

When an Incident Occurs

  1. Calmly assess incident level
  2. Complete initial report within deadline
  3. Conduct technical handling simultaneously
  4. Continuously update report content
  5. Complete closure report

Need Security Incident Response Support?

Every second counts when an incident occurs. Professional support can help you control damage faster.

CloudInsight provides:

  • Incident response consultation
  • Technical investigation support
  • Reporting assistance
  • Recovery recommendations

Contact us for emergency support, and we'll help you handle the incident.
