Using Zyte API from Scrapy Cloud#
Following this guide you will learn how to deploy the Scrapy project from the Zyte API tutorial to Scrapy Cloud and run your spiders in the cloud.
Before you follow this guide, you need to:
Deploy your Scrapy project to Scrapy Cloud#
Now that you have a Scrapy project that uses Zyte API, you will deploy it to a Scrapy Cloud project.
Install shub, the Scrapy Cloud command-line application:

```shell
pip install --upgrade shub
```
Create a text file at zyte-api-tutorial/requirements.txt with the following content:
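For a project built by following the Zyte API tutorial, requirements.txt typically declares the scrapy-zyte-api plugin; a sketch, with the exact package list being an assumption:

```text
# Assumed dependency list: the scrapy-zyte-api plugin used by the tutorial project.
scrapy-zyte-api
```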
Create a YAML file at zyte-api-tutorial/scrapinghub.yml with the following content:

```yaml
requirements:
  file: requirements.txt
stacks:
  default: scrapy:2.7
```
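scrapinghub.yml can also record project IDs, so that later deploys can omit the ID on the command line. A sketch, assuming 000000 is your Scrapy Cloud project ID:

```yaml
projects:
  default: 000000  # assumed project ID; replace with your own
requirements:
  file: requirements.txt
stacks:
  default: scrapy:2.7
```

With a default project configured this way, running shub deploy with no arguments deploys to that project.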
Run shub login and, when prompted, paste your API key and press Enter.
On the Zyte dashboard, select your Scrapy Cloud project under Scrapy Cloud Projects, and copy your Scrapy Cloud project ID from the web browser URL bar.
The numeric segment of that URL, for example 000000, is your Scrapy Cloud project ID.
Make sure that zyte-api-tutorial is your current working folder.
Run the following command, replacing 000000 with your actual project ID:

```shell
shub deploy 000000
```
Your Scrapy project has now been deployed to your Scrapy Cloud project.
Run a Scrapy Cloud job#
Now that you have deployed your Scrapy project to your Scrapy Cloud project, it is time to run one of your spiders on Scrapy Cloud:
On the Zyte dashboard, select your Scrapy Cloud project under Scrapy Cloud Projects.
On the Dashboard page of your project, select Run in the top-right corner.
In the Run dialog box:
Select the Spiders field and, from the spider list that appears, select your spider name.
Select Run.
A new Scrapy Cloud job will appear in the Running job list:
Once the job finishes, it will move to the Completed job list:
Follow the link from the Job column, 1/1.
On the job page, select the Items tab.
On the Items page, select Export → CSV.
The downloaded file will have the same data as the books.csv file that you generated locally earlier.
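As a quick sanity check, the two exports can be compared programmatically. A minimal sketch, assuming the local file is books.csv and the downloaded export was saved alongside it (both file names are placeholders):

```python
# Sketch: check that the CSV exported from Scrapy Cloud contains the same
# data as the books.csv file generated locally. The file names below are
# placeholders; point them at your actual local and downloaded files.
import csv

def read_rows(path):
    """Read a CSV file into a list of rows (each row is a list of strings)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.reader(f))

def same_data(path_a, path_b):
    """Return True if both CSV files hold the same rows, ignoring row order."""
    return sorted(read_rows(path_a)) == sorted(read_rows(path_b))

# Example (placeholder file names):
# same_data("books.csv", "items.csv")
```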
Now that you know how to deploy your code and run a job in Scrapy Cloud, see Scrapy Cloud usage for more in-depth documentation.