As countries around the globe pay closer attention to environmental issues, many countries and regions have recently promulgated policies to tighten emissions. Under the Paris Agreement on climate change, emission trading exchanges are one way to gather and attract green financial flows for investment in sustainable development, contributing to emission reduction commitments. For enterprises, emissions can be treated as another type of manageable asset: if a company exceeds its quota, it must pay for the excess, and if it does not use the full quota, it can sell the remainder to other businesses on an exchange. As a result, the number of emissions trading systems worldwide, both national and sub-national, is increasing. The large volumes of data these systems generate can be collected and analyzed to support decision-making in trading. Robotic process automation (RPA) is a valuable technology for reducing and automating repetitive tasks by simulating human workflows; it can carry out repetitive routines, make workflows more efficient and effective, and reduce human error. This research focuses on how an RPA crawling bot can automatically extract data from emission trading websites every day and convert it into the input format (Comma-Separated Values, CSV files) of an operational data store serving both IT and non-IT users. To this end, I propose a high-level RPA system architecture that crawls large volumes of data from multiple sources and pre-processes them before they are stored in the main database. I also integrate Apache Airflow to monitor all processes of the RPA crawling bot. Finally, I present an experimental prototype of a robotic process automation system that crawls trading data in real time. This research can serve as a reference for future real trading markets by providing a method that improves the accuracy and coverage of crawling gas emission trading data, which is increasingly essential yet hard to predict.
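To make the described pipeline concrete, the sketch below shows the kind of daily crawl-to-CSV task the proposed architecture implies, scheduled and monitored with Apache Airflow. This is a minimal illustration only: the endpoint URL, CSV path, and field names are hypothetical placeholders, Airflow 2.4+ is assumed, and the actual prototype's RPA tooling and pre-processing steps may differ.

```python
# Minimal sketch of a daily crawl-to-CSV task monitored by Apache Airflow.
# Assumptions (not from the paper): a hypothetical trading-data endpoint,
# a CSV drop folder consumed by the operational data store, Airflow 2.4+.
import csv
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

SOURCE_URL = "https://example-ets.org/api/daily-prices"  # hypothetical endpoint
CSV_PATH = "/data/ods/ets_prices_{ds}.csv"               # daily-partitioned output


def crawl_and_store(ds, **_):
    """Fetch one day of trading records and write them as a CSV file."""
    records = requests.get(SOURCE_URL, params={"date": ds}, timeout=30).json()
    with open(CSV_PATH.format(ds=ds), "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["market", "date", "price", "volume"])
        writer.writeheader()
        for row in records:
            # Keep only the columns expected by the operational data store.
            writer.writerow({k: row.get(k) for k in writer.fieldnames})


with DAG(
    dag_id="ets_rpa_crawler",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # one crawl per trading day
    catchup=False,
) as dag:
    PythonOperator(
        task_id="crawl_and_store",
        python_callable=crawl_and_store,
    )
```

Running this DAG under Airflow gives each daily crawl a logged, retryable task instance, which is the monitoring role Airflow plays for the RPA crawling bot in the proposed architecture.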