For integration with the Nette Framework use the provided compiler extension. The minimum configuration looks like this:
```neon
extensions:
    crawler_client: SixtyEightPublishers\CrawlerClient\Bridge\Nette\DI\CrawlerClientExtension

crawler_client:
    crawler_host_url: <full url to your crawler instance>
```
Requests to the Crawler API must always be authenticated, so credentials must be provided:
```neon
crawler_client:
    crawler_host_url: <full url to your crawler instance>
    credentials:
        username: <username>
        password: <password>
```
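If you only want to keep the values out of the config file itself, standard Nette parameters can also be used. A sketch, assuming the `crawlerUsername` and `crawlerPassword` parameters are defined elsewhere (for example via dynamic parameters or an included config file):

```neon
crawler_client:
    crawler_host_url: <full url to your crawler instance>
    credentials:
        username: %crawlerUsername%
        password: %crawlerPassword%
```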
If you don't want credentials hardcoded in the configuration at all (for example, because they are loaded from a database), you can write your own class implementing the `CredentialsInterface` and register it as a service.
```neon
crawler_client:
    crawler_host_url: <full url to your crawler instance>

services:
    - MyCredentialsService
```
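Such a service might look roughly like the sketch below. Note that the interface's namespace and method names (`getUsername()`, `getPassword()`) are assumptions here, as is the `CredentialsRepository` helper — check the actual `CredentialsInterface` for the real contract.

```php
<?php

declare(strict_types=1);

use SixtyEightPublishers\CrawlerClient\Authentication\CredentialsInterface; // assumed namespace

// Hypothetical implementation that resolves credentials lazily,
// e.g. from a database, instead of hardcoding them in the config.
final class MyCredentialsService implements CredentialsInterface
{
    public function __construct(
        private readonly CredentialsRepository $repository, // hypothetical repository service
    ) {}

    public function getUsername(): string
    {
        return $this->repository->find('crawler')->username;
    }

    public function getPassword(): string
    {
        return $this->repository->find('crawler')->password;
    }
}
```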
The client uses the Guzzle library to communicate with the Crawler API. To pass custom options to the Guzzle configuration, use the `guzzle_config` section.
```neon
crawler_client:
    crawler_host_url: <full url to your crawler instance>
    guzzle_config:
        # your custom options, e.g.
        timeout: 0
```
Custom middlewares can be registered as follows:
```neon
crawler_client:
    crawler_host_url: <full url to your crawler instance>
    middlewares:
        - MyCustomMiddleware
        - AnotherCustomMiddleware()
```
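As an illustration only, a middleware that adds a custom header to every outgoing request might look roughly like this Guzzle-style sketch. The exact middleware contract isn't shown here, so the shape below (a callable decorating the next handler) and the header name are assumptions — consult the middleware documentation for the real interface.

```php
<?php

declare(strict_types=1);

use Psr\Http\Message\RequestInterface;

// Hypothetical Guzzle-style middleware: wraps the next handler and
// attaches a tracing header before the request is sent. The actual
// contract expected by the `middlewares` option may differ.
final class MyCustomMiddleware
{
    public function __invoke(callable $next): callable
    {
        return static function (RequestInterface $request, array $options) use ($next) {
            return $next($request->withHeader('X-Request-Source', 'my-app'), $options);
        };
    }
}
```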
You can read more about middleware here.
The default serializer implementation can also be replaced via the `serializer` option.
```neon
crawler_client:
    crawler_host_url: <full url to your crawler instance>
    serializer: MyExtendedSerializer()
```
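A replacement would typically decorate or extend the default implementation. The sketch below is purely illustrative — the `SerializerInterface` namespace and the `serialize()`/`deserialize()` method names are assumptions; check the serializer documentation for the actual contract.

```php
<?php

declare(strict_types=1);

use SixtyEightPublishers\CrawlerClient\Serializer\SerializerInterface; // assumed namespace

// Hypothetical decorator over the default serializer that could, for
// example, post-process deserialized responses. Method names are
// assumed — verify them against the real SerializerInterface.
final class MyExtendedSerializer implements SerializerInterface
{
    public function __construct(
        private readonly SerializerInterface $inner, // the default serializer
    ) {}

    public function serialize(object $value): string
    {
        return $this->inner->serialize($value);
    }

    public function deserialize(string $payload, string $type): object
    {
        return $this->inner->deserialize($payload, $type);
    }
}
```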
You can read more about the serializer here.