From edc872d57242fbd479834a462170096cca7622f9 Mon Sep 17 00:00:00 2001
From: semantic-release-bot
Date: Sun, 14 Nov 2021 19:55:42 +0000
Subject: [PATCH] chore(release): 2.0.2 [skip ci]

## [2.0.2](https://github.com/armand1m/papercut/compare/v2.0.1...v2.0.2) (2021-11-14)

### Bug Fixes

* add tests for pagination and increased coverage, fixes selector utilities ([#9](https://github.com/armand1m/papercut/issues/9)) ([eec651b](https://github.com/armand1m/papercut/commit/eec651bff2f018192d85030da86c017219ca85ab))
---
 CHANGELOG.md                           |  7 +++++++
 README.md                              |  4 ++--
 docs/index.html                        |  4 ++--
 docs/interfaces/CreateRunnerProps.html |  4 ++--
 docs/interfaces/GeosearchResult.html   |  2 +-
 docs/interfaces/RunProps.html          | 10 +++++-----
 docs/interfaces/ScrapeProps.html       |  2 +-
 docs/interfaces/ScraperOptions.html    |  6 +++---
 docs/interfaces/ScraperProps.html      |  4 ++--
 docs/modules.html                      | 21 ++++++++++++++++-----
 package.json                           |  2 +-
 11 files changed, 42 insertions(+), 24 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index ea2e84d..c9460e1 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,3 +1,10 @@
+## [2.0.2](https://github.com/armand1m/papercut/compare/v2.0.1...v2.0.2) (2021-11-14)
+
+
+### Bug Fixes
+
+* add tests for pagination and increased coverage, fixes selector utilities ([#9](https://github.com/armand1m/papercut/issues/9)) ([eec651b](https://github.com/armand1m/papercut/commit/eec651bff2f018192d85030da86c017219ca85ab))
+
 ## [2.0.1](https://github.com/armand1m/papercut/compare/v2.0.0...v2.0.1) (2021-11-14)

diff --git a/README.md b/README.md
index 778c4db..d6095ec 100644
--- a/README.md
+++ b/README.md
@@ -131,10 +131,10 @@ Papercut works well out of the box, but some environment variables are available
 
 ## Roadmap
 
-* \[-] Add unit tests
+* [x] Add unit tests
 * [x] Add documentation generation
 * [ ] Create medium article introducing the library
-* \[-] Create a gh-pages for the library
+* [x] Create a gh-pages for the library
 * [x] Create more examples
 
 ## Contributing

diff --git a/docs/index.html b/docs/index.html
index 0da5c3d..3e04de6 100644
--- a/docs/index.html
+++ b/docs/index.html
@@ -88,10 +88,10 @@

Environment Variables

Roadmap

diff --git a/docs/interfaces/CreateRunnerProps.html b/docs/interfaces/CreateRunnerProps.html
index b74be69..5058f1a 100644
--- a/docs/interfaces/CreateRunnerProps.html
+++ b/docs/interfaces/CreateRunnerProps.html
@@ -1,6 +1,6 @@
-CreateRunnerProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface CreateRunnerProps

Hierarchy

  • CreateRunnerProps

Index

Properties

Properties

logger

logger: Logger
+CreateRunnerProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface CreateRunnerProps

Hierarchy

  • CreateRunnerProps

Index

Properties

Properties

logger

logger: Logger

A pino.Logger instance.

-

options

+

options

The scraper options. Use this to tweak log, cache and concurrency settings.

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
diff --git a/docs/interfaces/GeosearchResult.html b/docs/interfaces/GeosearchResult.html
index 3dfe396..3aeb629 100644
--- a/docs/interfaces/GeosearchResult.html
+++ b/docs/interfaces/GeosearchResult.html
@@ -1 +1 @@
-GeosearchResult | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface GeosearchResult

Hierarchy

  • GeosearchResult

Index

Properties

latitude

latitude: number

longitude

longitude: number

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
+GeosearchResult | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface GeosearchResult

Hierarchy

  • GeosearchResult

Index

Properties

latitude

latitude: number

longitude

longitude: number
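
For context, the geosearch selector utility documented in docs/modules.html resolves a free-text query to one of these results. A minimal sketch of a selector using it, assuming async selector functions are acceptable (geosearch returns a Promise); the ".address" selector is a placeholder:

```ts
import type { SelectorUtilities, GeosearchResult } from "@armand1m/papercut";

// Sketch only: ".address" is a placeholder; the geosearch signature
// (q: string, limit?: number) => Promise<GeosearchResult> is taken from the docs.
const coordinates = async ({ text, geosearch }: SelectorUtilities): Promise<GeosearchResult> => {
  const address = text(".address");
  return geosearch(address);
};
```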

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
diff --git a/docs/interfaces/RunProps.html b/docs/interfaces/RunProps.html
index ef80dea..9400e44 100644
--- a/docs/interfaces/RunProps.html
+++ b/docs/interfaces/RunProps.html
@@ -1,7 +1,7 @@
-RunProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface RunProps<T, B>

Type parameters

Hierarchy

  • RunProps

Index

Properties

baseUrl

baseUrl: string
+RunProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface RunProps<T, B>

Type parameters

Hierarchy

  • RunProps

Index

Properties

baseUrl

baseUrl: string

The base URL to start scraping from.

This page will be fetched, parsed and mounted in a virtual JSDOM instance.

-

Optional pagination

pagination?: PaginationOptions
+

Optional pagination

pagination?: PaginationOptions

Optional pagination feature.

If enabled and configured, this will make papercut fetch, parse, mount and scrape multiple pages based on the given pagination options.
@@ -9,14 +9,14 @@

As long as you have a way to fetch the last page number from the page you're scraping, and use it as a query param in the page url, you should be fine.
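
The PaginationOptions fields themselves are not part of this diff, so the shape below is only a guess at how the description above could look in code — both property names (lastPageNumber, createPaginatedUrl) and the CSS selector are hypothetical:

```ts
import type { SelectorUtilities } from "@armand1m/papercut";

// Hypothetical PaginationOptions shape, mirroring the prose above:
// read the last page number off the page, then build each page URL
// with the page number as a query param.
const pagination = {
  lastPageNumber: ({ text }: SelectorUtilities) =>
    Number(text(".pagination .last")),             // hypothetical selector
  createPaginatedUrl: (baseUrl: string, page: number) =>
    `${baseUrl}?page=${page}`,                     // hypothetical helper
};
```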

-

selectors

selectors: T
+

selectors

selectors: T

The selectors to be used during the scraping process.

The result object will match the schema of the selectors.

-

strict

strict: B
+

strict

strict: B

If enabled, this will make Papercut scrape the page in strict mode. This means that if a selector function fails, the entire scraping run will be halted with an error.

When enabled, the result types will not expect undefined values.
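
In terms of the ScrapeResultType alias documented in docs/modules.html, the flag only toggles whether each result field may be undefined. A small sketch with a made-up selector map:

```ts
import type { ScrapeResultType } from "@armand1m/papercut";

// Made-up selector map, only to show how the strict flag (the B parameter) plays out.
type ExampleSelectors = {
  title: () => string;
  price: () => number;
};

// strict: true  -> { title: string; price: number }
type StrictItem = ScrapeResultType<ExampleSelectors, true>;

// strict: false -> { title?: string; price?: number }
type LooseItem = ScrapeResultType<ExampleSelectors, false>;
```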

-

target

target: string
+

target

target: string

The DOM selector for the target nodes to be scraped.

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
diff --git a/docs/interfaces/ScrapeProps.html b/docs/interfaces/ScrapeProps.html
index f8230d5..72cd229 100644
--- a/docs/interfaces/ScrapeProps.html
+++ b/docs/interfaces/ScrapeProps.html
@@ -1 +1 @@
-ScrapeProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface ScrapeProps<T, B>

Type parameters

Hierarchy

  • ScrapeProps

Index

Properties

document

document: Document

logger

logger: Logger

options

selectors

selectors: T

strict

strict: B

target

target: string

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
+ScrapeProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface ScrapeProps<T, B>

Type parameters

Hierarchy

  • ScrapeProps

Index

Properties

document

document: Document

logger

logger: Logger

options

selectors

selectors: T

strict

strict: B

target

target: string

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
diff --git a/docs/interfaces/ScraperOptions.html b/docs/interfaces/ScraperOptions.html
index 1de6aa4..f83c278 100644
--- a/docs/interfaces/ScraperOptions.html
+++ b/docs/interfaces/ScraperOptions.html
@@ -1,9 +1,9 @@
-ScraperOptions | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface ScraperOptions

Hierarchy

  • ScraperOptions

Index

Properties

cache

cache: boolean
+ScraperOptions | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface ScraperOptions

Hierarchy

  • ScraperOptions

Index

Properties

cache

cache: boolean

Enables HTML payload caching on disk. Keep in mind that papercut will not clear the cache for you: when enabling this, it is your responsibility to deal with cache invalidation.

default

false

-

concurrency

concurrency: { node: number; page: number; selector: number }
+

concurrency

concurrency: { node: number; page: number; selector: number }

Concurrency settings.

Type declaration

  • node: number

    Amount of concurrent promises for node scraping.

    @@ -14,7 +14,7 @@
  • selector: number

    Amount of concurrent promises for selector scraping.

    default

    2

    -

log

log: boolean
+

log

log: boolean

Enables writing pino logs to the stdout.

default

process.env.DEBUG === "true"
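
Put together, a Partial<ScraperOptions> literal would look roughly like this. The defaults noted in comments are only the ones documented on this page; the node and page numbers are arbitrary example values, and the import assumes the interface is exported as typed here:

```ts
import type { ScraperOptions } from "@armand1m/papercut";

// Sketch of the options shape documented above.
const options: Partial<ScraperOptions> = {
  cache: false,                        // documented default: false
  log: process.env.DEBUG === "true",   // documented default
  concurrency: {
    node: 2,      // concurrent promises for node scraping (example value)
    page: 2,      // concurrent promises for page scraping (example value)
    selector: 2,  // documented default: 2
  },
};
```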

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
diff --git a/docs/interfaces/ScraperProps.html b/docs/interfaces/ScraperProps.html
index 2435cf9..45e58b3 100644
--- a/docs/interfaces/ScraperProps.html
+++ b/docs/interfaces/ScraperProps.html
@@ -1,7 +1,7 @@
-ScraperProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface ScraperProps

Hierarchy

  • ScraperProps

Index

Properties

Properties

name

name: string
+ScraperProps | @armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

Interface ScraperProps

Hierarchy

  • ScraperProps

Index

Properties

Properties

name

name: string

The scraper name. This will be used only for logging purposes.

-

Optional options

options?: Partial<ScraperOptions>
+

Optional options

options?: Partial<ScraperOptions>

The scraper options. Use this to tweak log, cache and concurrency settings.

Legend

  • Property

Settings

Theme

Generated using TypeDoc

\ No newline at end of file
diff --git a/docs/modules.html b/docs/modules.html
index 6fe2796..a46f957 100644
--- a/docs/modules.html
+++ b/docs/modules.html
@@ -1,12 +1,12 @@
-@armand1m/papercut
Options
All
  • Public
  • Public/Protected
  • All
Menu

@armand1m/papercut

Index

Type aliases

ScrapeResultType

ScrapeResultType<T, B>: B extends true ? { [ Prop in keyof T]: ReturnType<T[Prop]> } : { [ Prop in keyof T]?: ReturnType<T[Prop]> }

Type parameters

Scraper

Scraper: ReturnType<typeof createScraper>

SelectorFunction

SelectorFunction: (utils: SelectorUtilities, self: SelectorMap) => any

Type declaration

SelectorMap

SelectorMap: Record<string, SelectorFunction>

Map of selector functions.

This type is meant to be checked with an extended type, as users are going to implement a derived version of this for custom scrapers.
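
"Checked with an extended type" in practice means constraining a user-supplied map with a generic bound instead of annotating it as SelectorMap directly, so each selector keeps its inferred return type. A sketch — the defineSelectors helper is not a papercut export, just an illustration:

```ts
import type { SelectorMap, SelectorUtilities } from "@armand1m/papercut";

// Illustrative helper (not part of papercut): the `T extends SelectorMap` bound
// validates the shape while preserving each selector's own return type.
const defineSelectors = <T extends SelectorMap>(selectors: T): T => selectors;

const selectors = defineSelectors({
  title: (utils: SelectorUtilities) => utils.text(".title"),
  link: (utils: SelectorUtilities) => utils.href("a"),
});
// ReturnType<typeof selectors["title"]> is string rather than any.
```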

-

SelectorUtilities

SelectorUtilities: ReturnType<typeof createSelectorUtilities>

Functions

Const createRunner

SelectorUtilities

SelectorUtilities: ReturnType<typeof createSelectorUtilities>

Functions

Const createRunner

  • Creates a runner instance.

    This method is called by the createScraper function, but can also be externally used if needed to use an
@@ -33,7 +33,7 @@

Parameters

  • props: RunProps<T, B>

    The scraping runner properties and selectors.

Returns Promise<ScrapeResultType<T, B>[]>

result Type-safe scraping results based on the given selectors and strict mode.
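
A hedged sketch of using the runner directly, assuming createRunner is effectively curried: it takes the CreateRunnerProps documented earlier (a pino logger plus options) and returns the run function whose parameters and return type are listed here. The URL and CSS selectors are placeholders:

```ts
import pino from "pino";
import { createRunner } from "@armand1m/papercut";
import type { SelectorUtilities } from "@armand1m/papercut";

// Assumption: createRunner(CreateRunnerProps) => run(RunProps) => Promise<results>.
const run = createRunner({
  logger: pino(),
  options: {
    log: false,
    cache: false,
    concurrency: { node: 2, page: 2, selector: 2 },
  },
});

const results = await run({
  baseUrl: "https://example.com/listing",   // placeholder
  target: ".listing-item",                  // placeholder
  selectors: {
    title: ({ text }: SelectorUtilities) => text(".title"),
  },
  strict: false,
});
```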

-

Const createScraper

Const createScraper

  • Creates a new scraper runner.

    This method is papercut's entrypoint. It will create a Scraper struct containing a runner that you can tweak
@@ -63,7 +63,18 @@

Parameters

  • props: RunProps<T, B>

    The scraping runner properties and selectors.

Returns Promise<ScrapeResultType<T, B>[]>

result Type-safe scraping results based on the given selectors and strict mode.
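
For the common path, a sketch of the end-to-end flow — assuming createScraper takes the ScraperProps documented earlier (name plus optional options) and that the returned scraper exposes this run signature as scraper.run (the method name is an assumption; URLs and CSS selectors are placeholders):

```ts
import { createScraper } from "@armand1m/papercut";
import type { SelectorUtilities } from "@armand1m/papercut";

const scraper = createScraper({
  name: "example-books",       // used for logging only
  options: { cache: false },   // Partial<ScraperOptions>
});

// Assumption: the runner is exposed as scraper.run; placeholders throughout.
const results = await scraper.run({
  baseUrl: "https://books.example.com",
  target: ".book-card",
  selectors: {
    title: ({ text }: SelectorUtilities) => text(".book-title"),
    cover: ({ src }: SelectorUtilities) => src("img.cover"),
    url: ({ href }: SelectorUtilities) => href("a.details"),
  },
  strict: true,
});
// With strict: true, results is Array<{ title: string; cover: string; url: string }>.
```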

-

Const createSelectorUtilities

  • createSelectorUtilities(element: Element): { all: (selector: string) => NodeListOf<Element>; attr: (selector: string, attribute: string) => string; className: (selector: string) => string; createWindowForHTMLContent: (htmlContent: string) => { close: () => void; window: DOMWindow }; element: Element; fetchPage: (url: string) => Promise<string>; geosearch: (q: string, limit?: number) => Promise<GeosearchResult>; href: (selector: string) => string; src: (selector: string) => string; text: (selector: string) => string }
  • Parameters

    • element: Element

    Returns { all: (selector: string) => NodeListOf<Element>; attr: (selector: string, attribute: string) => string; className: (selector: string) => string; createWindowForHTMLContent: (htmlContent: string) => { close: () => void; window: DOMWindow }; element: Element; fetchPage: (url: string) => Promise<string>; geosearch: (q: string, limit?: number) => Promise<GeosearchResult>; href: (selector: string) => string; src: (selector: string) => string; text: (selector: string) => string }

    • all: (selector: string) => NodeListOf<Element>
        • (selector: string): NodeListOf<Element>
        • Parameters

          • selector: string

          Returns NodeListOf<Element>

    • attr: (selector: string, attribute: string) => string
        • (selector: string, attribute: string): string
        • Parameters

          • selector: string
          • attribute: string

          Returns string

    • className: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string

    • createWindowForHTMLContent: (htmlContent: string) => { close: () => void; window: DOMWindow }
        • (htmlContent: string): { close: () => void; window: DOMWindow }
        • Parameters

          • htmlContent: string

          Returns { close: () => void; window: DOMWindow }

          • close: () => void
              • (): void
              • Returns void

          • window: DOMWindow
    • element: Element
    • fetchPage: (url: string) => Promise<string>
        • (url: string): Promise<string>
        • Parameters

          • url: string

          Returns Promise<string>

    • geosearch: (q: string, limit?: number) => Promise<GeosearchResult>
    • href: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string

    • src: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string

    • text: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string

Const geosearch

scrape

Const createSelectorUtilities

  • createSelectorUtilities(element: Element): { all: (selector: string) => { asArray: Element[]; nodes: NodeListOf<Element> }; attr: (selector: string, attribute: string) => string; className: (selector: string) => string; createWindowForHTMLContent: (htmlContent: string) => { close: () => void; window: DOMWindow }; element: Element; fetchPage: (url: string) => Promise<string>; geosearch: (q: string, limit?: number) => Promise<GeosearchResult>; href: (selector: string) => string; mapNodeListToArray: (nodeList: NodeList) => Element[]; src: (selector: string) => string; text: (selector: string) => string }
  • +

    This method creates the selector utilities provided to every selector function given to the scrape method.

    +

    These utilities are meant to make the experience of using papercut a bit more pleasant. They're currently not extendable, but one could, in theory, create higher-order function extensions.

    +

    Almost every single one of these methods has a default fallback of an empty string, in case it fails to find the element or a specific property.

    +

    At the same time, you also have direct access to the element from selector functions if needed for more complex tasks (a sketch follows the signature listing below).

    +

    Parameters

    • element: Element

    Returns { all: (selector: string) => { asArray: Element[]; nodes: NodeListOf<Element> }; attr: (selector: string, attribute: string) => string; className: (selector: string) => string; createWindowForHTMLContent: (htmlContent: string) => { close: () => void; window: DOMWindow }; element: Element; fetchPage: (url: string) => Promise<string>; geosearch: (q: string, limit?: number) => Promise<GeosearchResult>; href: (selector: string) => string; mapNodeListToArray: (nodeList: NodeList) => Element[]; src: (selector: string) => string; text: (selector: string) => string }

    • all: (selector: string) => { asArray: Element[]; nodes: NodeListOf<Element> }
        • (selector: string): { asArray: Element[]; nodes: NodeListOf<Element> }
        • Parameters

          • selector: string

          Returns { asArray: Element[]; nodes: NodeListOf<Element> }

          • asArray: Element[]
          • nodes: NodeListOf<Element>
    • attr: (selector: string, attribute: string) => string
        • (selector: string, attribute: string): string
        • Parameters

          • selector: string
          • attribute: string

          Returns string

    • className: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string

    • createWindowForHTMLContent: (htmlContent: string) => { close: () => void; window: DOMWindow }
        • (htmlContent: string): { close: () => void; window: DOMWindow }
        • Parameters

          • htmlContent: string

          Returns { close: () => void; window: DOMWindow }

          • close: () => void
              • (): void
              • Returns void

          • window: DOMWindow
    • element: Element
    • fetchPage: (url: string) => Promise<string>
        • (url: string): Promise<string>
        • Parameters

          • url: string

          Returns Promise<string>

    • geosearch: (q: string, limit?: number) => Promise<GeosearchResult>
    • href: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string

    • mapNodeListToArray: (nodeList: NodeList) => Element[]
        • (nodeList: NodeList): Element[]
        • Parameters

          • nodeList: NodeList

          Returns Element[]

    • src: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string

    • text: (selector: string) => string
        • (selector: string): string
        • Parameters

          • selector: string

          Returns string
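
Taken together, a selector map usually just destructures the helpers it needs. A sketch that sticks to the signatures listed above, with placeholder CSS selectors:

```ts
import type { SelectorUtilities } from "@armand1m/papercut";

// Each helper falls back to an empty string when it cannot find the
// element or property, as noted in the description above.
const selectors = {
  title: ({ text }: SelectorUtilities) => text(".title"),
  image: ({ src }: SelectorUtilities) => src("img"),
  link: ({ href }: SelectorUtilities) => href("a"),
  badge: ({ className }: SelectorUtilities) => className(".badge"),
  id: ({ attr }: SelectorUtilities) => attr(".card", "data-id"),
  tagCount: ({ all }: SelectorUtilities) => all(".tag").asArray.length,
};
```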

Const geosearch

scrape

  • the scrape function

    this function will select all target nodes from the given document and spawn promise pools for

diff --git a/package.json b/package.json
index fbe46dc..5780ae3 100644
--- a/package.json
+++ b/package.json
@@ -1,5 +1,5 @@
 {
-  "version": "2.0.1",
+  "version": "2.0.2",
   "license": "MIT",
   "main": "dist/index.js",
   "types": "dist/index.d.ts",