Building websites and attracting big audiences to come and visit can take millions of dollars and years of hard work.
But more businesses are being ripped off and their sites “cloned” in a matter of minutes using free “open source” tools that are widely available online.
At least several hundred popular international websites – including those of PayPal, BuzzFeed News, Adobe.com and Stuff – have been copied in an operation, begun last month, that Stuff Limited stumbled across.
But the find hasn’t raised an eyebrow among experts, who describe it as run of the mill and the internet as “the world’s largest copying machine”.
Stuff Ltd product architecture head Justis Chan came across Stuff webpages hidden behind a website address, www.hotelmosteiro.com, which is one of hundreds of “fronts” for copied websites registered to an address in Beijing.
The cloned sites were configured so they were not normally visible to regular internet users but could be seen by Google’s search engine bots which ply the web, indexing and ranking webpages.
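Cloaking of this kind is commonly done by inspecting the User-Agent header of each request and serving the cloned pages only when it matches a known search-engine crawler. A minimal Python sketch of the idea follows; the function names, return strings and bot list are hypothetical, for illustration only.

```python
# Illustrative sketch of User-Agent "cloaking": serve the copied pages only
# to search engine crawlers, so they are indexed and ranked, while ordinary
# visitors never see them. Names here are hypothetical.

# Substrings commonly found in search-engine crawler User-Agent strings.
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")

def is_search_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a search engine bot."""
    ua = user_agent.lower()
    return any(sig in ua for sig in CRAWLER_SIGNATURES)

def select_content(user_agent: str) -> str:
    """Choose which page to serve based on who appears to be asking."""
    if is_search_crawler(user_agent):
        return "cloned-site-content"   # indexed and ranked by the search engine
    return "placeholder-page"          # what a regular visitor would see
```

Real cloaking operations are usually more careful – checking crawler IP ranges as well as headers – but the basic discrimination step works as above.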
Chan feared the “worst case scenario” was that the websites would be “decloaked” and then used to steal people’s usernames, passwords and other information, once they built up their “legitimacy” with the search engine.
There were signs that the owners were preparing the capability to carry out phishing attacks “at scale” using legitimate-sounding web addresses, he said.
But the sites could instead be used to “steal” web traffic and advertising revenues.
Cyber-security agency CERT NZ was notified of the cloning operation, and said such events happened regularly.
“Website ‘scraping’ involves taking copies of the content of one website, and hosting it on another. This can allow a scammer to impersonate a website,” it said in a statement.
“While it’s not always possible to guess the rationale for website scraping, it can be a precursor to other types of scams or attacks such as hosting malware, phishing sites, or search engine optimisation scams.”
Professor Ryan Ko, director of the New Zealand Institute for Security and Crime Science at Waikato University, confirmed it was not uncommon for websites to be copied on a large scale.
The people behind such activities could be looking to distribute malware but sometimes simply copied sites in order to understand how they were put together and to look for security vulnerabilities, he said.
In some cases “lazy web developers” copied websites to use as a template for customers.
“The possibilities are infinite,” Ko said, suggesting it wasn’t even worth trying to guess at the motivation.
Peter Bailey, general manager of Auckland cyber security company Aura, said it would often take copies of clients’ websites – but with their permission – to check for vulnerabilities.
“They know we are doing it, that’s the difference.”
Website scraping was common, he confirmed. “It is either people with malicious intent or you get designers copying sites.
“I think people should be aware if they see their own website copied. But there is a lot of it, so it may not mean you are being specifically targeted.”
United States security company Sucuri said in 2016 that the phenomenon was becoming more common and there was little that legitimate businesses could do to prevent it.
“Like most hackers and ‘black hats’, they will always find a way to get around any protection you put in place.
“Once they succeed in stealing your [search results], they can make sudden changes to the site for any kind of malicious/malware-serving purpose or even just to feed their ongoing spam campaigns.”
Google offers an online tool that lets website owners ask it to remove fake sites from its search results, but it is not usually an instant process.
A common way to request a take-down of the offending site itself is to file an online application through the “DMCA” process. The acronym stands for the US “Digital Millennium Copyright Act”.
But Google also did not express concern about the copied sites identified by Stuff Ltd.
“If people see search results that aren’t accurate or useful they can click on the feedback link at the bottom of any ‘search’ results page and send information back to Google,” spokesman Nic Hopkins said.
“There are also support and resources available for webmasters if they want to raise concerns about spam or malware.”
Bailey said cyber-crime had become much more organised over the past five years.
“In the past it was lots of individual hackers trying to break into sites. It has just become so organised and such big money that there are in effect illegal ‘companies’ working on getting people’s data.
“We are just seeing so much more of this stuff.”