As companies build applications to cater for distributed workforces, the use of public cloud is a no-brainer, but as the parade of companies failing to secure S3 buckets shows, cloud customers are not securing it properly.

"What we've seen across the board is most customers consumption of public cloud is way ahead of their ability to secure it," Barracuda senior vice president of data protection, network and application security Tim Jefferson told ZDNet.

"They haven't figured out how to use the native services securely and how to instrument the controls because in many of those cases they're very developer focus, so you'd have to essentially be a software developer to really understand how the application teams are using the native services, and then get your head around the best way [of] architecting controls."

Jefferson believes the solution is sitting right there on the cloud platforms themselves.

"The magic of public cloud is all the instrumentation and monitoring is done, it's sitting there for free, which is historically the most expensive and hard part to do on premise," he said.

"The trick now is just knowing how to call those APIs and suck in that telemetry and make it more actionable which companies like us have ... we can identify within seconds, every resource that's deployed, who deployed it, what its configuration state is, and how does that compare against best practice and then you can automate remediation."


Sinan Eren, founder and CEO of zero trust access provider Fyde until November, when the company was purchased by Barracuda, added that it is possible to keep an eye on multiple clouds.
