New research from Decodo.com has identified the most common barriers businesses face when trying to collect or access external data, with budget and resource constraints topping the list.
The survey of 1,000 decision-makers found that 37% of businesses cite budget or resource limitations as their biggest challenge, followed by IP blocks or geo-restrictions (33%) and a lack of technical expertise (32%).
Poor data quality, CAPTCHAs and anti-bot measures, and limited internal infrastructure were each highlighted by 29% of respondents.
Vaidotas Juknys, head of commerce at Decodo, said: “Budget and resource constraints remain the biggest obstacles for businesses trying to scale their data operations.
“While high-quality data often comes at a cost, choosing a trusted provider ensures safer, more reliable solutions that pay off in the long run.
“One easy way to save money without losing quality is by comparing providers. We suggest choosing a provider that offers rotating residential IPs instead of large, cheap pools because this helps prevent your requests from getting blocked and saves you time.
“Good providers also offer built-in error handling and support, so your team spends less time fixing issues. Start small with affordable tools and only scale up as your data needs grow; it’s a proven approach to keeping costs and resources under control.”
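In practice, that advice maps onto a fairly simple pattern. As an illustration only, here is a minimal Python sketch of routing requests through a rotating residential gateway. The hostname, port, and credentials are placeholders rather than any real provider's endpoint; the point is just that each request exits through a different IP instead of a single cheap datacentre address.

```python
import requests

# Placeholder rotating-residential gateway; the hostname, port, and
# credentials are illustrative, not a real provider endpoint.
PROXY = "http://USERNAME:PASSWORD@gate.example-provider.com:7000"

def fetch(url: str, timeout: float = 15.0) -> requests.Response:
    """Route a single request through the rotating gateway.

    With a rotating residential pool, each request typically exits
    through a different IP, which reduces the chance of blocks compared
    with sending every request from the same datacentre address.
    """
    return requests.get(
        url,
        proxies={"http": PROXY, "https": PROXY},
        timeout=timeout,
    )

if __name__ == "__main__":
    resp = fetch("https://httpbin.org/ip")
    print(resp.status_code, resp.text)  # shows the exit IP the target saw
```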
Anti-bot measures are also proving a significant obstacle. A third of businesses reported IP blocks or geo-restrictions as key technical issues when collecting public web data, and almost as many cited CAPTCHAs and other anti-bot defences.
Juknys said: “These challenges are only growing as anti-bot measures become more advanced. We’ve developed a robust approach to overcome these limitations – leveraging proxy networks combined with intelligent web scrapers to reliably access public data at scale, without getting blocked.”
Justinas Tamaševičius, head of engineering at Decodo, added: “Overcoming IP blocks, geo-restrictions, and CAPTCHAs requires a combination of smart technology and global infrastructure. It’s not just about bypassing roadblocks – it’s about doing it right.
“We use proxies with ethically sourced IPs that simulate real user behaviour and distribute requests in a way that’s both effective and compliant.
“Our all-in-one web scraper is designed to adapt to complex anti-bot systems. It can detect and respond to CAPTCHAs automatically, rotate IPs intelligently, and access 195+ global locations, all while keeping the user’s identity secure and operations under control.”
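The capabilities described here, automatic CAPTCHA handling and intelligent IP rotation, are product features rather than published code, but the underlying retry-and-rotate idea can be sketched. The Python snippet below is an assumption for illustration, not Decodo's actual scraper: it retries a request through a hypothetical rotating-proxy gateway whenever it sees a typical block status or a crude CAPTCHA marker, backing off between attempts so the fresh exit IP gets a clean start.

```python
import time
import requests

# Placeholder rotating-proxy endpoint; not a real provider URL.
PROXY = "http://USERNAME:PASSWORD@gate.example-provider.com:7000"
BLOCK_STATUSES = {403, 407, 429, 503}  # statuses commonly returned when blocked

def fetch_with_retries(url: str, max_attempts: int = 5) -> requests.Response:
    """Fetch a page, retrying through the rotating pool when blocked.

    Each retry goes back through the gateway, which normally assigns a
    fresh exit IP, and waits with exponential backoff before trying again.
    """
    delay = 1.0
    last_exc = None
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.get(
                url,
                proxies={"http": PROXY, "https": PROXY},
                timeout=15,
            )
            # Very crude block detection: status code or a "captcha" marker
            # in the body; a real scraper would use far more robust signals.
            blocked = resp.status_code in BLOCK_STATUSES or "captcha" in resp.text.lower()
            if not blocked:
                return resp
        except requests.RequestException as exc:  # network errors also trigger a retry
            last_exc = exc
        time.sleep(delay)
        delay *= 2  # back off so retries don't resemble an attack
    raise RuntimeError(f"{url} still blocked after {max_attempts} attempts") from last_exc
```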
The research warns that using outdated scraping tools or unreliable proxy services increases the risk of being blocked and could harm both operations and reputation.