There are three main changes to HIPAA coming Sept. 23, 2013, that medical practices need to know about. Here's what they are and what you should do.
The September 23, 2013, deadline by which covered entities such as physician practices must comply with the HIPAA Omnibus Final Rule is quickly approaching. The rule marks the most sweeping changes to the HIPAA Privacy and Security Rules since they were first implemented.
While the final rule brings about many changes, there are three in particular that likely warrant the most attention from practices now. The following column identifies those changes and provides practical guidance to meet the new requirements.
Change 1: The definition of what a "breach" is has been modified.
What it means: Under the old law, a breach was an event that "compromises the security or privacy of the protected health information (PHI) such that the use or disclosure poses a significant risk of financial, reputational or other harm to the affected individual."
Under the new rule, the definition of a breach is broader: an impermissible use or disclosure of PHI is presumed to be a breach unless the practice can demonstrate a low probability that the PHI has been compromised. For example, if a thumb drive holding patient records is lost and the records are not password-protected or encrypted, that counts as a breach even if the data is never accessed by anyone. An incident report should be filed with your HIPAA officer.
If you lose a laptop but can prove the computer is encrypted and that nobody can access the information without secure credentials, indicating a low probability that the PHI has been compromised, you will not have committed a breach.
What practices should do: Perform a complete risk assessment in an effort to minimize security holes and prevent possible breaches. Three of the most common causes of breaches are stolen laptops, lost or stolen external hard drives or thumb drives, and sending PHI through unsecured email.
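Two of the breach causes above (lost thumb drives and external hard drives) are mitigated by encrypting files before they ever reach portable media. As one illustrative sketch, not a requirement stated by the rule, a file of exported records can be encrypted with the standard `openssl` command-line tool; the file names and password here are hypothetical placeholders:

```shell
# Hypothetical stand-in for an exported patient-records file.
printf 'id,name\n1,Jane Doe\n' > patient_records.csv

# Encrypt with AES-256 before copying to a thumb drive. In practice the
# password would come from a secure source, never hard-coded like this.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in patient_records.csv -out patient_records.csv.enc \
    -pass pass:CHANGE_ME

# Verify the round trip: decrypting must reproduce the original file.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in patient_records.csv.enc -out decrypted.csv \
    -pass pass:CHANGE_ME
```

Only the encrypted `.enc` file would be copied to removable media; losing that drive then demonstrates the "low probability of compromise" discussed above rather than a reportable breach.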
Change 2: The definition of a "business associate" (BA) has been completely reworded.
What it means: A BA is essentially any company or person outside the covered entity's workforce that has access to PHI. This includes contractors and now, under the new rule, the BA's subcontractors as well.
What practices should do: Review all BA agreements to see if they need to be revised or replaced. With older agreements, a BA could potentially include a clause that says the BA cannot be held liable for PHI breaches.
Now, a BA can be held directly liable. BAs can still try to include the clause to remove themselves and their subcontractors from liability, but a practice would be wise to object to such a request and a BA will lack a strong argument for the clause's inclusion.
An example of when a BA might be liable: If an IT company keeps an off-site data backup and somebody steals the backup device, the BA can be held directly liable for the breach of all of the health records on that device.
An example of when a subcontractor might be liable: If the IT company were to bring in a subcontractor to run network cable and electric lines in a new service center, that subcontractor would then be considered a BA and potentially liable since it could have access to PHI.
Because the new HIPAA security rules apply more stringently to BAs, BAs themselves must now remain HIPAA compliant.
Change 3: HIPAA audits will happen more frequently, fines will be substantially higher, and auditors will be incentivized to find security problems.
What it means: Periodic HIPAA audits by HHS were already authorized and underway, but covered entities can expect them to happen more frequently once the new rule is enacted.
In addition, fines for HIPAA violations will become significantly higher and essentially without a limit.
Finally, auditors will receive what amounts to a "kickback" for each security violation discovered during an audit, which incentivizes auditors to dig deep and find any and all holes.
What practices should do: The best practice is to perform risk assessments at least quarterly. This helps ensure that security fixes already put in place are working and holding, and it identifies other potential problems. If you can show an auditor that you performed a risk assessment, identified a problem, and have a plan in place to fix it, the auditor is less likely to treat the problem as a violation, provided it does not remain unresolved.
Many covered entities rely upon an external company to perform such risk assessments. These companies are not only skilled at identifying security problems often overlooked by covered entity staff, but they also have the knowledge and ability to handle requirements such as creating policies and procedures for administrative safeguards, setting up employee training, and configuring IT systems with a data backup plan, individual user names, and password policies.
Larger organizations may consider hiring someone to handle these responsibilities, but this may be cost prohibitive. For smaller organizations, it's often more cost-effective to hire a company to handle all of these tasks and help ensure year-round compliance.
Nelson Gomes is the president and CEO, and Michael Daly is the senior systems engineer and HIPAA security officer, of PriorityOne Group, a New Jersey-based healthcare IT consulting firm providing services to medical practices, ambulatory surgery centers and clinics.