As the ‘new kid on the block’, ChatGPT can be alluring even to Church circles eager to be first to automate, for instance, the analysis of survey exercises.
However, given the complex security risks that surround this new artificial intelligence technology, adopting it without due diligence can invite leaks, data loss, or unintended data exposure at unexpected times. It is important to note that personally identifiable information, once introduced into ‘the system’, becomes part of its knowledge bank and can potentially surface in other organisations’ work or in their data banks.
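One practical precaution that follows from this is to scrub personally identifiable information from survey responses before they ever leave the organisation. The sketch below is illustrative only, assuming a simple regex-based approach; the function name and patterns are hypothetical, and a real deployment would require a properly vetted PII-detection tool.

```python
import re

# Illustrative patterns only -- a real deployment needs a vetted
# PII-detection tool; this merely shows the principle of scrubbing
# text before it is submitted to an external AI service.
PII_PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace obvious PII with placeholder tokens."""
    for placeholder, pattern in PII_PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

# Example survey response, redacted before any external submission
response = "I enjoyed the retreat. Contact me at mary@parish.example or 555-123-4567."
print(redact_pii(response))
```

Redaction of this kind does not remove the need for policy and training, but it reduces the chance that a careless copy-and-paste places personal data into an external system permanently.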
Additionally, since ChatGPT is connected to the internet, it is exposed to cyber attacks. Data can be breached by unauthorised parties and re-introduced as if it were original information, even after it has been tampered with. The Church must therefore, with due diligence, implement strong security protocols and clear guidelines for the use of ChatGPT within its workspace.
As the Church moves forward with ChatGPT, my strong recommendation is to limit access to this new technology to those who are trained and appointed to use it.