Codegen from Yadda features
$ npm install bdd-tc



Transforms Yadda features into working TestCafe tests.

How does it work?

First we need a feature-file, say ./e2e/features/demo.feature:

Feature: Some description

Scenario: Perform a single action

  Given one step
  When I run another step
  Then I test for something specific

Now we must define some steps to cover it, e.g. ./e2e/steps/demo.js:

import { Selector } from 'testcafe';

export default {
  'Given one step': () => async t => {
    await t.navigateTo('/'); // example action
  },

  'When I run another step': () => async t => {
    await t.click(Selector('button')); // example action
  },

  'Then I test for $phrase': value => async t => {
    await t.expect(Selector('body').innerText).contains(value);
  },
};

Finally we can generate the test-files and execute them:

$ bdd-tc e2e/features -- testcafe --color chrome:headless


Steps are labeled functions that receive arguments and return actual test functions.

Those calls are inlined in the generated tests, but their code is actually imported:

import $step0 from '../steps/demo.js';

fixture `Some description`;

test(`Perform a single action`, async t => {
  await $step0[`Given one step`]()(t);
  await $step0[`When I run another step`]()(t);
  await $step0[`Then I test for \$phrase`]("something specific")(t);
});


Before and after hooks for tests can be defined too.

They're similar to step functions:

export default {
  before: {
    namedHook: () => async t => {
      // do something
    },
  },

  after: {
    // etc.
  },

  // use @path as input
  url(path = '/') {
    return process.env.BASE_URL + path;
  },
};
Now you can reference them with @before and @after annotations respectively:

@before=namedHook
Feature: Some description

@after=namedHook
Scenario: Perform a single action

  Given one step
  When I run another step
  Then I test for something specific

Depending on the context, beforeEach/afterEach or before/after is used automatically.
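As a rough illustration only (this is not bdd-tc's actual code), assuming feature-level hooks wrap the whole fixture while scenario-level hooks run around each test, the choice could be sketched as:

```javascript
// Hypothetical sketch: pick the TestCafe hook name for a @before/@after
// annotation based on where it appears. The mapping here is an
// assumption, not bdd-tc's verified behavior.
function hookFor(level, kind) {
  // level: 'feature' | 'scenario'; kind: 'before' | 'after'
  if (level === 'feature') return kind;
  return kind === 'before' ? 'beforeEach' : 'afterEach';
}

console.log(hookFor('feature', 'before')); // 'before'
console.log(hookFor('scenario', 'after')); // 'afterEach'
```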


Additional $matchers can be defined within steps as follows:

export default {
  matchers: {
    test: '(foo|bar)',
    noMatch: '(?:[^\\s]*)',
  },

  'When ask for $test': test => async t => {
    console.log(test); // foo OR bar
  },

  'Then verify $noMatch': noMatch => async t => {
    console.log(noMatch); // undefined
  },
};

Captures made from matchers will be passed as arguments; non-matched placeholders will be captured as (.+?) and passed too.

Use (?:<PATTERN>) to omit captured values from matched placeholders.


Built-in annotations are:

  • @xsnapshot — Unique for features, disables any @snapshot from scenarios below
  • @snapshot — Unique for scenarios, it'll take snapshots after each step!
  • @before — Setup before/beforeEach from features and scenarios.
  • @after — Setup after/afterEach from features and scenarios.
  • @only — Append .only on generated fixture/test calls.
  • @skip — Completely omit fixture/test from generated code.
  • @page — Optional pathname, used only if url() is a function
  • @url — Append .page calls on generated fixture/test calls.

A given @snapshot value is passed as takeSnapshot's selector option, so it can be an array, in which case it will fall back until one selector matches/exists.

Any other annotation is kept as input-data and passed through invoked hooks.

Multiple values using [ ;,] as separator will be treated as arrays, e.g.

@values=foo,bar baz


Complex values can be passed as JSON values, e.g.

@arr=["foo", "bar"]
@obj={"baz": "buzz"}
@str="Other value, with commas, etc."

Working with steps

To assist you while writing steps, you can leverage:

  • takeSnapshot(...) — Calls the same method from testcafe-blink-diff
  • useSelectors(obj) — Object containing Selector(...) definitions, can be nested
  • useFixtures(obj) — Object containing any values as fixtures, can be nested
  • getVal(key) — Validate and return value from registered fixtures, see above
  • getEl(key) — Validate and return selector from registered ones, see above
  • $(...) — Shortcut for Selector(...), same options as original call
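Since useSelectors/useFixtures accept nested objects, getVal/getEl presumably resolve dotted keys against them. A minimal sketch of such a lookup (an illustration, not the library's code):

```javascript
// Hypothetical nested lookup, as getVal/getEl might resolve dotted
// keys against a registry of fixtures or selectors.
function getFrom(registry, key) {
  const value = key.split('.').reduce((obj, k) => obj && obj[k], registry);
  if (value === undefined) throw new Error(`Unknown key: ${key}`);
  return value;
}

const fixtures = { user: { name: 'Alice' } };
console.log(getFrom(fixtures, 'user.name')); // 'Alice'
```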

Working with fixtures

By importing the bdd-tc/matchers module you gain access to:

  • jsf(schema[, options]) — Generate one or many samples from given JSON-Schema1
  • faker[...] — Faker.js instance - see demo
  • chance[...] — Chance.js instance - see docs
  • gen([type[, schema]]) — Generate a sample based on any given type, additional JSON-Schema is applied if given
  • date([step]) — Random Date object, given optional step: seconds, minutes, hours, days, months or years
  • pick(dataset) — Pick any value from the given dataset; it even works with strings!
  • oneOf(dataset, whereField, fieldValue) — Find any item on dataset that matches field/value
  • number([min[, max]]) — Returns a random number within min/max boundaries
  • randexp(regexp) — Returns a string generated from any given RegExp
  • shuffle(dataset) — Copies, randomizes and returns any given dataset

1 We're using json-schema-faker under the hood to generate those.
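For intuition, the simpler helpers like number() and pick() plausibly boil down to something like the following (minimal sketches under that assumption; the real module wraps full libraries such as Faker.js and Chance.js):

```javascript
// Hypothetical sketches of number() and pick(), for illustration only.
function number(min = 0, max = 100) {
  // Random integer within inclusive min/max boundaries.
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

function pick(dataset) {
  // Works for arrays and strings alike, since both support indexing.
  return dataset[number(0, dataset.length - 1)];
}

const n = number(1, 6);
console.log(n >= 1 && n <= 6); // true
console.log('abc'.includes(pick('abc'))); // true
```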


Development

  • npm install — Setup dependencies
  • npm run e2e — Run defined e2e tests
  • npm run test:ci — To run all unit-tests

Inspect the generated results from E2E snapshots:

  • npm run report:ui
  • open generated/index.html
