
PapaParse as a Promise

I’m torso deep in re-writing the Booked front end for the 3.0 release. One of the big improvements will be the incorporation of modern JavaScript tools and patterns to replace the clunky JavaScript that currently handles client side functionality and dynamic rendering.

Most of the admin tools allow for CSV imports for data. I’m now using PapaParse to handle client side parsing of this information. But I really dislike the API where you have to provide callbacks for complete and error methods. So I made it into a Promise. This example is using TypeScript, but is easily translated.

export async function parseCsv(file: File): Promise<ParseResult> {
  return new Promise((resolve, reject) => {
    Papa.parse(file, {
      header: true,
      skipEmptyLines: true,
      transform: (value: string): string => {
        return value.trim();
      },
      complete: (results: ParseResult) => {
        return resolve(results);
      },
      error: (error: ParseError) => {
        return reject(error);
      },
    });
  });
}

Which means I can now just call

const parsedCSVFile = await parseCsv(file);
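The same callback-to-Promise pattern works for any callback-style API. Here's a sketch using a stand-in parser (parseWithCallbacks below is hypothetical, not part of PapaParse) so it runs without any dependencies:

```javascript
// Hypothetical callback-style API standing in for Papa.parse,
// so the Promise-wrapping pattern can be shown in isolation.
function parseWithCallbacks(input, { complete, error }) {
  try {
    const rows = input
      .split("\n")
      .filter((line) => line.trim() !== "")
      .map((line) => line.split(",").map((v) => v.trim()));
    complete({ data: rows });
  } catch (e) {
    error(e);
  }
}

// Wrap the callback API in a Promise, same shape as parseCsv above.
function parseText(input) {
  return new Promise((resolve, reject) => {
    parseWithCallbacks(input, {
      complete: (results) => resolve(results),
      error: (err) => reject(err),
    });
  });
}
```

Now `const results = await parseText("a,b\n1,2");` gives you the parsed rows without any callback plumbing at the call site.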

Test driven development when I have no idea how to test anything

It’s no secret that I’m a firm believer in the effectiveness of test-driven development. But I don’t practice TDD 100% of the time. One of those times is when I’m just getting started with a new technology or library.

For me, one of the hardest aspects of TDD is writing a test when I have no idea what code I want to test. This was a major hurdle for me as I began working with JavaScript, React and the rest of that stack. I simply had no idea how to even write a test.

Spikes are great for this, of course. Go off for a few hours, play around, do research, and come back with a better sense of what you’re dealing with. But still, writing that first test can be paralyzing.

What I realized during my initial few weeks of working with a new tech stack is that I did, in fact, know what test I wanted to write. I just didn’t know how to set the test up, execute it, or validate the results. Pretty much every aspect of TDD 🙂

My “tests”

Let’s take a ubiquitous example to illustrate what I ended up doing – product search.

Syntax and usage of the actual test infrastructure are irrelevant for now – an implementation detail that I could deal with later as I learned more.

So I opened up a text editor and created a file called “product-search-tests.txt”

And I added the first test name. This part is easy and pretty much framework-agnostic anyway. “When search runs, and matching products exist for the given search term, return the matching products”.

Then the preconditions that I needed. “Product named keyboard is in stock. Product named keypad is out of stock. Product named fuzzy bunny is in stock.”

Of course, then the execution. Again, this was simple. “Run product search with search term key.”

Then the post-conditions. “Search results contain keyboard and do not contain keypad or fuzzy bunny.”

That was it. Text in notepad++. No code. No frameworks. Heck, no IDE.

Unsticking myself

Doing this made me realize I didn’t need to care about how a React app was structured. I didn’t get caught up in the details of how to set up an in-memory product catalog (do I use redux, context, something else?) or how to query the results (do I need a button click event handler, what about an API?).

I was able to focus completely on the behavior.

That was enough for me to really get going. I ended up writing a bunch of “tests” for a lot of different scenarios. Ignoring the technology removed all of those constraints and kept my brain from trying to solve problems that may not even exist.

Making the tests real

Of course, a text document isn’t a test suite. It’s pretty much just a list of requirements. But this gave me a blueprint for the real test implementation. I was able to gradually translate each test into code. Of course, some tests changed. Some dropped off completely. I added others. As the code materialized, I split some tests out to more focused areas.
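As an illustration (this is not the actual Booked code), the notepad test above might eventually translate into something like this, with a hypothetical in-memory catalog and searchProducts function standing in for the real implementation:

```javascript
// Hypothetical in-memory product catalog -- the preconditions.
const catalog = [
  { name: "keyboard", inStock: true },
  { name: "keypad", inStock: false },
  { name: "fuzzy bunny", inStock: true },
];

// Hypothetical search: in-stock products whose name contains the term.
function searchProducts(products, term) {
  return products.filter((p) => p.inStock && p.name.includes(term));
}

// "When search runs, and matching products exist for the given search
// term, return the matching products" -- the execution.
const results = searchProducts(catalog, "key");

// The post-conditions.
console.assert(results.some((p) => p.name === "keyboard"));
console.assert(!results.some((p) => p.name === "keypad")); // out of stock
console.assert(!results.some((p) => p.name === "fuzzy bunny")); // no match
```

The text test maps almost one-to-one: preconditions become fixtures, execution becomes a function call, post-conditions become assertions.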

The appeal of test-after development

Over the years, I’ve recommended to some of the people I’ve mentored that they write their first tests with a pencil and a notepad. Developers typically want to explore the code and figure out the details of the test before they get started. This way they “know” what to test, so it’s “easier”.

I think this is the appeal of the more common test-after approach to unit testing. Here, you can figure everything out, run your manual tests to ensure it all works, then go back and write unit tests that end up proving that the code does what the developer made it do.

I think it’s been shown more than enough times that this approach is great for hitting code coverage metrics, but terrible for actually growing a code base organically through small steps. Much has been written about the fragility of test-after unit tests, so I won’t add redundancy here.

Is this really TDD?

This is absolutely not red-green-refactor, “by the book” TDD. But I don’t really care because I think this captures the intent of TDD. I think about the next chunk of behavior needed, write a test to validate a small increment of progress, and move on to the next increment. Of course, this approach defers the “validation” step until a better understanding of the problem emerges, but I’m OK with that.

Machine Readable Logs in PHP

I’m a big believer that log files should be easy for machines to read and ingest. Structured logging opens up the door for using tools like the ELK stack or Splunk to aggregate, search, monitor, and alert on application activity.

The most common format for structured logging seems to be JSON. There are, of course, a lot of logging libraries for PHP, but I’ve been using log4php forever and don’t have any plans to switch. Like most logging libraries, log4php logs straight up strings.

There is no built-in appender for JSON log formatting, though. The good news is that it’s easy to get properly formatted JSON logs using the standard LoggerAppenderDailyFile.

Here’s what I did.

First, I updated my log4php.config.xml file to write a JSON structure for the LoggerLayoutPattern. My updated appender looks like this. (For reference, all of log4php’s conversion parameters can be found here)

<appender name="jsonAppender" class="LoggerAppenderDailyFile">
   <layout class="LoggerLayoutPattern">
      <param name="conversionPattern" value="{&quot;timestamp&quot;:&quot;%d&quot;, &quot;level&quot;:&quot;%p&quot;, &quot;ip&quot;:&quot;%server{REMOTE_ADDR}&quot;, &quot;details&quot;:%m}%n"/>
   </layout>
   <param name="file" value="/tmp/application_json_%s.log"/>
   <param name="append" value="true"/>
</appender>

Let’s break the conversion pattern down.

We open it with a curly brace { and close it with another brace followed by a newline: }%n

This is just the basic JSON data structure.

Next we add a timestamp. This is where it gets a little funky. We have to encode the quotes because the config file is XML. The %d parameter is the standard ISO 8601 datetime format.

&quot;timestamp&quot;:&quot;%d&quot;

This is equivalent to a JSON object that looks like {"timestamp":"%d"}

The rest is pretty much the same.

{&quot;timestamp&quot;:&quot;%d&quot;, &quot;level&quot;:&quot;%p&quot;, &quot;ip&quot;:&quot;%server{REMOTE_ADDR}&quot;, &quot;details&quot;:%m}%n

This creates a JSON object that looks like this

{"timestamp":"%d", "level":"%p", "ip":"%server{REMOTE_ADDR}", "details":%m}%n

Next we need to make sure that the log information is written as JSON instead of plain text. I have a standard static Log class that I use for all logging.

class Log
{
    /**
     * @var Log
     */
    protected static $_instance;

    /**
     * @var Logger
     */
    protected $logger;

    protected function __construct()
    {
        Logger::configure("log4php.config.xml");
        $this->logger = Logger::getLogger('default');
    }

    /**
     * @return Log
     */
    private static function &GetInstance()
    {
        if (is_null(self::$_instance)) {
            self::$_instance = new Log();
        }

        return self::$_instance;
    }

    /**
     * @param string $message
     * @param array $args
     * @return LogMessage
     */
    private static function EnrichLog($message, $args)
    {
        $logMessage = new LogMessage($message, $args);
        $debug = debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS);
        if (is_array($debug)) {
            $debugInfo = $debug[1];
        }
        else {
            $debugInfo = array('file' => null, 'line' => null);
        }

        $logMessage->userId = isset($_SESSION['userId']) ? $_SESSION['userId'] : null;
        $logMessage->file = $debugInfo['file'];
        $logMessage->line = $debugInfo['line'];
        return $logMessage;
    }

    /**
     * @param string $message
     * @param array $args
     */
    public static function Debug($message, $args = array())
    {
        if (!self::GetInstance()->logger->isDebugEnabled()) {
            return;
        }

        try {
            $log = json_encode(self::EnrichLog($message, $args));
            self::GetInstance()->logger->debug($log);
        } catch (Exception $ex) {
            echo $ex;
        }
    }

    /**
     * @param string $message
     * @param array $args
     */
    public static function Info($message, $args = array())
    {
        if (!self::GetInstance()->logger->isInfoEnabled()) {
            return;
        }

        try {
            $log = json_encode(self::EnrichLog($message, $args));
            self::GetInstance()->logger->info($log);
        } catch (Exception $ex) {
            echo $ex;
        }
    }

    /**
     * @param string $message
     * @param array $args
     */
    public static function Warn($message, $args = array())
    {
        if (!self::GetInstance()->logger->isWarnEnabled()) {
            return;
        }

        try {
            $log = json_encode(self::EnrichLog($message, $args));
            self::GetInstance()->logger->warn($log);
        } catch (Exception $ex) {
            echo $ex;
        }
    }

    /**
     * @param string $message
     * @param array $args
     * @param Exception|null $exception
     */
    public static function Error($message, $args = array(), $exception = null)
    {
        if (!self::GetInstance()->logger->isErrorEnabled()) {
            return;
        }

        try {
            $logMessage = self::EnrichLog($message, $args);
            $logMessage->exception = $exception;
            $log = json_encode($logMessage);
            self::GetInstance()->logger->error($log);
        } catch (Exception $ex) {
            echo $ex;
        }
    }

    /**
     * @param string $message
     * @param array $args
     * @param Exception|null $exception
     */
    public static function Fatal($message, $args = array(), $exception = null)
    {
        if (!self::GetInstance()->logger->isFatalEnabled()) {
            return;
        }

        try {
            $logMessage = self::EnrichLog($message, $args);
            $logMessage->exception = $exception;
            $log = json_encode($logMessage);
            self::GetInstance()->logger->fatal($log);
        } catch (Exception $ex) {
            echo $ex;
        }
    }

    public static function SetInstance($logger)
    {
        self::$_instance = $logger;
    }
}

class LogMessage
{
    /**
     * @var string
     */
    public $message;

    /**
     * @var int|null
     */
    public $userId;

    /**
     * @var string|null
     */
    public $file;

    /**
     * @var string|null
     */
    public $line;

    /**
     * @var array|null
     */
    public $args;
	
    /**
     * @var Exception|null
     */
    public $exception;

    public function __construct($message, $args = null)
    {
        $this->message = $message;
        $this->args = $args;
    }
}

Using this is pretty simple; we just need to make sure we pass in any variables we want in a format that converts to JSON easily. I use an array because it’s readable and lightweight.

Log::Debug('This is a log message', ['param1' => 'value1', 'param2' => 123]);

Would end up writing a log message that looks like this

{"timestamp":"2020-09-18T10:17:16-04:00", "details":{"message":"This is a log message","userId":1,"file":"/var/www/app/test-logger.php","line":71,"args":{"param1":"value1", "param2": 123},"exception":null}, "level": "DEBUG"}

This log can easily be parsed and searched using ElasticSearch now. This makes it trivial to find specific log events and look at trends.
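Because each line is a JSON object, any consumer can pull fields out without regex gymnastics. A quick sketch (the sample line below mirrors the output format above):

```javascript
// A sample structured log line, shaped like the output shown above.
const line =
  '{"timestamp":"2020-09-18T10:17:16-04:00", "level":"DEBUG", ' +
  '"details":{"message":"This is a log message","userId":1}}';

// One JSON.parse and every field is addressable.
const entry = JSON.parse(line);
console.log(entry.level); // "DEBUG"
console.log(entry.details.userId); // 1
```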

To take it a step further, I could create a correlationId that is stored in the session at the beginning of a web request and added to every log entry. Now I can connect all the log events for every request.

Node – Get Named Command Line Arguments

I am writing a small command line utility using Node and needed to get named arguments from the command line. A quick Google search led me down crazy complicated rabbit holes and (of course) a bunch of recommendations to just install npm modules.

There ain’t no way I’m installing an npm module to READ COMMAND LINE ARGUMENTS!

So I wrote a tiny function that gets command line arguments and turns them into an object. 10 lines is all you need in ES6.

function getArgs() {
  // Skip the first two entries: the node binary and the script path
  const args = process.argv.slice(2);
  let params = {};

  args.forEach(a => {
    // Split on the first "=" only, so values containing "=" are preserved
    const [name, ...rest] = a.split("=");
    params[name] = rest.join("=");
  });

  return params;
}

Easy enough. Now if I call the script from the command line like this:

node myapp.js arg1=foo arg2=bar

I can transform the arguments to an object by calling:

const args = getArgs();

Which will give me the following object:

{
arg1: "foo",
arg2: "bar"
}

So now I can just do this:

// called using node myapp.js name=nick role="master of node"

const args = getArgs();
console.log(`${args.name} is the ${args.role}`);

// which outputs "nick is the master of node"

I mean, that’s all there is to it. I’ll save my rant about the wasteland that is npm for another post.

Configuring Booked SSO with SAML

Booked comes with multiple Single Sign On plugins out of the box. There are many benefits to SSO over standard authentication. For administrators, having a single point of account credential and access administration is very valuable. If someone leaves the organization they don’t have to deactivate accounts in multiple systems. For your normal user, the benefit is not having to register and remember yet another set of application credentials.

In this post we’ll cover how to set up SSO with SAML.

Most SSO configurations for Booked are pretty straightforward – you just update the configuration options for the plugin. But SAML is different. SAML requires a 3rd party application called SimpleSAMLphp to be running on the same server as Booked.

Install SimpleSAMLphp

Our first step is to download the latest version of SimpleSAMLphp and install it on your web server. I recommend installing it outside of your publicly visible directories and setting up a subdomain pointing to the www directory.

For example, if you install it to /home/username/simplesamlphp and you have Booked running out of /home/username/public_html/booked, then you’d create a subdomain such as saml.bookedscheduler.com pointing to /home/username/simplesamlphp/www. The reason we do this is because the only files which need to be publicly visible in SimpleSAMLphp are located in the www directory. Exposing more than that opens unnecessary security holes.

Configure SimpleSAMLphp

SimpleSAMLphp has a lot of configuration options. If you’re like me and far from an expert in SAML, it’s overwhelming. Luckily, since Booked is a Service Provider it doesn’t need anything special.

I’ll go through each of the settings that need to be updated individually. Please note that at the time of writing this post, the latest version of SimpleSAMLphp was 1.18.5. It’s possible that the names of the options will change in future versions.

Open up /home/username/simplesamlphp/config/config.php with a text editor.

baseurlpath should be updated to the full path of the SimpleSAMLphp www directory. If you followed the above advice and created a subdomain, this should be something like https://saml.yourdomain.com

technicalcontact_email should be set to your email address (or anyone responsible for managing SSO integrations)

secretsalt should be set to any secure, random value.

auth.adminpassword should be set to any secure, random value.

trusted.url.domains should be set to an array of domains that will participate in the SSO handshake. I use array('saml.bookedscheduler.com', 'bookedscheduler.com')

session.cookie.domain should be set to the wildcard subdomain of your primary domain. For example, I use .bookedscheduler.com

session.cookie.secure should be set to true, assuming all traffic is sent over https.

store.type should be set to sql. This ensures that PHP sessions from Booked and sessions from SimpleSAMLphp do not conflict.

store.sql.dsn should be set to a writable location for the sqlite database. You must have SQLite support in PHP enabled for this to work. Alternatively, you can set up any PDO supported database to store session data. Since I use SQLite, I have this set to something like sqlite:/home/username/tmp/sqlitedatabase.sq3
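Pulling those settings together, the relevant portion of config.php would look roughly like this (the domains, email, and paths are example values; substitute your own, and generate real random values for the salt and password):

```php
// config/config.php (excerpt) -- example values only
$config = array(
    'baseurlpath' => 'https://saml.yourdomain.com/',
    'technicalcontact_email' => 'admin@yourdomain.com',
    'secretsalt' => 'replace-with-a-long-random-value',
    'auth.adminpassword' => 'replace-with-a-secure-random-value',
    'trusted.url.domains' => array('saml.yourdomain.com', 'yourdomain.com'),
    'session.cookie.domain' => '.yourdomain.com',
    'session.cookie.secure' => true,
    'store.type' => 'sql',
    'store.sql.dsn' => 'sqlite:/home/username/tmp/sqlitedatabase.sq3',
);
```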

Exchange Metadata

Now that we have the configuration set, we’ll need to exchange metadata.

The first thing to do is get the metadata XML from the Identity Provider that you’re integrating with. SimpleSAMLphp has a handy metadata XML conversion tool, which we’ll use to finish up our configuration.

Open the subdomain for SimpleSAMLphp in a browser (https://saml.bookedscheduler.com was what I used). Click on the Federation tab, then the XML to SimpleSAMLphp metadata converter link. You’ll be prompted to enter the auth.adminpassword that you set in your config.php

Paste in the XML or, if you have it saved to a file, upload it. SimpleSAMLphp will output at least one PHP version of that metadata.

For each one of those, locate the file with the same name in /home/username/simplesamlphp/metadata. The most common files to update will be saml20-idp-remote.php or shib13-idp-remote.php

Delete everything except the opening php tag, then paste in the output from SimpleSAMLphp.

Copy the value of the entityid (usually found on the 3rd line of that file) and open up /simplesamlphp/config/authsources.php. Find the idp setting, and paste the value of the entityid there.

Update SAML Configuration in Booked

Whew, almost done. The last few settings are in Booked.

First, open up /your-booked-directory/config/config.php, find the authentication setting in the plugins section and set the value to Saml.

$conf['settings']['plugins']['Authentication'] = 'Saml';

Open up /your-booked-directory/plugins/Authentication/Saml and copy Saml.config.dist.php to Saml.config.php. Open Saml.config.php in an editor.

simplesamlphp.lib should be updated to the root filesystem directory of SimpleSAMLphp. If you’re using the settings I described here, this would be /home/username/simplesamlphp.

simplesamlphp.config should be updated to the config filesystem directory for SimpleSAMLphp. In this case /home/username/simplesamlphp/config

Most of the remaining settings are attribute maps. SAML will send over user attributes, but often with obscure names. Booked needs to know which attribute maps to the proper user field in Booked.

There are only 2 absolutely required fields to map – username/userid and email. For example, if the username is being sent across in the SAML payload as urn:oid:0.1.2.3 you’d set simplesamlphp.username to this value like $conf['settings']['simplesamlphp.username'] = 'urn:oid:0.1.2.3';

This is the same for all the other attributes. If you don’t know the attributes coming across then you can add the following line to plugins/Authentication/Saml/SamlUser.php as the first line in the constructor.

Log::Debug('Saml attributes are: %s', var_export($saml_attributes, true));

Enable Logging in Booked and try to log in. Booked will write the attributes to the log file, and you can copy the names into the Booked SAML configuration file.

Some Restrictions

A couple important notes with SAML enabled. The first is that you will no longer be able to log into Booked with any other credentials. There is no “back door” – so every authentication request will be routed through SAML.

The other restriction is that you will not be able to use any authenticated method from the API. SAML performs a series of browser redirects in order to complete the authentication process. When using the API you are not within the context of a browser, so authentication will fail.

Logging In

Once all the mapping is complete, you should be able to log into Booked via your organization’s federated log in page. Your users will no longer have to remember another set of credentials and your account management just got one step easier.

Social Distancing Features in Booked

Slowing the spread of COVID-19 is about the most important thing we can do as a society right now. I know that Booked is heavily used in laboratories and other organizations where working remotely may not be possible. I want to highlight a few features that can help reinforce social distancing in your lab or organization.

Capacity

The most obvious one is capacity limits for resources. For any resource that is designed for a group of people (conference room, lab bench) you can set the maximum capacity to something small. This prevents more than a set number of people from participating in that reservation.

Buffer Times

Buffer times are perfect for enforcing time between reservations. This can assist with cleaning and disinfecting or simply help space people out and prevent unintended contact.

Maximum Reservation Duration

This is a simple rule that will prevent reservations over a certain duration. Combining this with capacity constraints can help limit people’s interactions.

Quotas

Quotas are typically used to prevent over-booking by a single individual, but they can also be used to keep resources under-booked, reducing the sanitation burden.

You can also use quotas to ensure that resources are available to as many people as possible. For example, many people use Booked for appointment scheduling. If you have a COVID-19 response team or internal health team, this could be a way to make sure they’re available to as many people as possible.

Hosting and Support

Booked continues to be open source and completely free to anyone that wants it. You can download it here.

I have been offering hosting services for many years, which completely removes the burden of installation and support from your internal team. The first 30 days of hosting are free, with absolutely no obligation or commitment of any kind.

I know that business is unpredictable right now and placing orders for new services is far down the list of priorities. I’m happy to work with your team on free trial extensions and flexible billing terms until things are back to normal.

Configuring Sonar with a Create React App in TypeScript

There are a ton of posts on StackOverflow and Medium and the rest of the internet on setting up SonarQube, but I couldn’t find a definitive guide on configuring it with a React web application (using react-scripts/create react app) written in TypeScript. Turns out that it’s not that hard once you know all the pieces that need to be pulled together.

This article was written in February, 2020. These are the versions of the different components I’m using. Your mileage may vary.

  • Yarn 1.2.1
  • Node 12.4.1
  • React 16.12.0
  • Sonar 7.9.2
  • TypeScript 3.7.5
  • Docker Desktop (Windows) 2.2.0.0
  • Docker 19.03.5
  • react-scripts 3.3.0
  • sonarqube-scanner 2.5.0
  • jest-sonar-reporter 2.0.0

First of all, I’m going to assume nothing is ejected from the CRA configuration and that you have a working application written in TypeScript. For example, if you cannot successfully run npm test or yarn test, then this guide will not work. I’m also assuming you’re comfortable with Docker.

SonarQube

Get SonarQube up. I used Docker for this because it was the quickest way.

First, set your memory allocation for Docker to at least 4GB for the Sonar container to be able to run correctly. ElasticSearch needs this, I guess.

Next, I used this docker-compose file from https://github.com/SonarSource/docker-sonarqube/blob/master/example-compose-files/sq-with-postgres/docker-compose.yml. Just docker-compose up. This will start a SonarQube instance using Postgres on http://localhost:9000

Open that URL in a browser, log in with admin/admin, and make sure everything is looking good. There should be a wizard that guides you through creating a security token for your user. If not, you can always do so in Administration > Security and creating a Sonar user just for project analysis (this is probably a good idea anyway).

Analyzing your React application with Sonar

Tools

You’ll need a couple of tools to simplify this process. Install sonarqube-scanner and jest-sonar-reporter into your React project using either yarn or npm. Save them as development dependencies.

yarn add -D sonarqube-scanner
yarn add -D jest-sonar-reporter

Sonar Configuration

I was expecting to do a bunch of configuration within Sonar itself, but it can all be contained within your React project. In the root of your project (at the same level as your package.json) create a file named sonar-project.js with the following contents. I’ll go over the details next.

const sonarqubeScanner = require("sonarqube-scanner");
sonarqubeScanner(
  {
    serverUrl: "http://localhost:9000",
    token: "YOUR-TOKEN-HERE",
    options: {
      "sonar.sources": "./src",
      "sonar.exclusions": "**/__tests__/**",
      "sonar.tests": "./src/__tests__",
      "sonar.test.inclusions": "./src/__tests__/**/*.test.tsx,./src/__tests__/**/*.test.ts",
      "sonar.typescript.lcov.reportPaths": "coverage/lcov.info",
      "sonar.testExecutionReportPaths": "reports/test-report.xml",
    },
  },
  () => {},
);

Each of these settings is a standard Sonar configuration property, but they weren’t immediately clear to me.

  • serverUrl is the URL to your SonarQube instance
  • token is the security token assigned to your Sonar user
  • sonar.sources is the base directory for all of your code. This is where your React application lives (in my case the *.tsx files). By default, CRA puts __tests__ within the src directory. We’ll deal with that next.
  • sonar.exclusions is everything you do not want Sonar to analyze. The most important one for me is that we don’t want analysis run on our tests. In fact, if there is overlap between sonar.sources and sonar.tests then Sonar will throw an indexing error. So I exclude anything in any __tests__ folder.
  • sonar.tests is the location of all of your tests. By default, CRA puts this in /src/__tests__
  • sonar.test.inclusions is a comma separated list of all files that should be treated as test files. I have a mix of .tsx and .ts tests, but they all follow the standard jest pattern of testname.test.ts*
  • sonar.typescript.lcov.reportPaths is the path to the test coverage output file from jest. By default this will be coverage/lcov.info
  • sonar.testExecutionReportPaths is the path to the jest-sonar-reporter output file. We’ll configure this next.

Test Coverage Configuration

The first thing to note is that we use the sonar.typescript.lcov.reportPaths property in our sonar-project.js configuration, not the javascript property.

By default jest-sonar-reporter outputs a file called test-report.xml to the root directory. I don’t like littering that directory with unrelated files, so I added this to the end of my package.json file in order to put the report in a reports directory.

"jestSonar": {
  "reportPath": "reports",
  "reportFile": "test-report.xml",
  "indent": 4
}

The last thing you need to do is tell react-scripts to use this test report generator rather than the default. I assume you have something like this in your package.json scripts definition.

"test": "react-scripts test --silent --env=jsdom"

We need to change that so we process the results in a Sonar-friendly format

"test": "react-scripts test --silent --env=jsdom --coverage --testResultsProcessor jest-sonar-reporter"

To be honest, I do one more thing so that tests don’t run in watch mode. My actual test script uses cross-env to set CI=true, which tells react-scripts that we’re running in continuous integration mode and should not watch.

"test": "cross-env CI=true react-scripts test --silent --env=jsdom --coverage --testResultsProcessor jest-sonar-reporter",

Running Sonar Analysis

We’ll add another script to our package.json file to initiate the Sonar analysis.

"sonar": "node sonar-project.js"

At this point we have all of our configuration done and just need to run everything.

First run your tests

yarn test

This should run all of your tests with coverage. You’ll have files in /coverage and /reports.

Next, run your sonar analysis

yarn sonar

This will take a couple minutes, depending on the size of your project. But when it’s complete, refresh your Sonar projects and you should see your project show up. By default, it will use the name defined in your package.json file.

Continuous Integration

If you wanted to (and why wouldn’t you) integrate this into your CI environment so every push triggers a Sonar analysis, you can just add a build step that invokes yarn sonar after your test stage.
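For example, a GitHub Actions workflow might look something like this. This is only a sketch, assuming the test and sonar scripts above and a SonarQube server reachable from your CI runner; adapt the steps to whatever CI system you use.

```yaml
# .github/workflows/sonar.yml -- illustrative only
name: sonar
on: [push]
jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1
        with:
          node-version: 12
      - run: yarn install
      - run: yarn test   # writes /coverage and /reports
      - run: yarn sonar  # pushes the analysis to SonarQube
```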

Other Projects

I’m using this base set of instructions for all of my TypeScript based projects. The only thing that really changes is the location of source, tests, and the inclusion/exclusion list. For example, in my api, where I have __tests__ at the same level as src (instead of nested within), my sonar-project.js looks like this.

const sonarqubeScanner = require("sonarqube-scanner");

sonarqubeScanner(
  {
    serverUrl: "http://localhost:9000",
    token: "MY-TOKEN",
    options: {
      "sonar.sources": "./src",
      "sonar.tests": "./__tests__",
      "sonar.test.inclusions": "./__tests__/**/*.test.ts",
      "sonar.typescript.lcov.reportPaths": "coverage/lcov.info",
      "sonar.testExecutionReportPaths": "reports/test-report.xml",
    },
  },
  () => {},
);

Some Cleanup

This whole thing will create some cruft in your project. You’ll want to ignore /.scannerwork, /coverage, and /reports in Git.

Booked Tips: Limiting Resource Usage

Booked is configurable in so many ways. In this article we’ll review a few ways to control when, how, and by whom resources are booked.

Let’s start with some of the simple settings. In Application Management > Resources you are able to control broad settings on when resources can be reserved.

Resource Access Settings

The Access section of each resource lets you control the following: How far in advance a resource can be booked, how far in advance an existing reservation for that resource can be updated, and how far in advance an existing reservation can be deleted.

By default, this is all unrestricted. To set any of these values, uncheck the unrestricted option and set the lead time. For example, you could force a 6 hour lead time on all reservations; any attempt to book the resource within 6 hours of the start time would then be denied.

To limit how far into the future a reservation can be made, you would set a maximum advance notice. For example, a 30 day maximum would prevent any reservations more than 30 days in the future.

Resource Duration Settings

To limit (or force) how long a reservation must be, you can change the minimum and maximum duration settings. For example, you could force reservations to be at least 4 hours but no more than 8 hours in duration.

Quotas

Quotas are a powerful, though somewhat complex, way to control resource usage. Using Quotas you can restrict usage based on cumulative time booked or cumulative number of reservations over a given period of time. To get started, navigate to Application Management > Quotas.

So if you wanted to limit users of a specific group to only be able to book 5 hours per week for a specific resource, you can set up that quota rule.

There are a few more advanced features.

By default a rule will include past reservations, but you can ignore anything in the past if you want. So let’s use that same 5 hours per week rule as above and assume today is Wednesday. If we’re including past reservations and I had an hour booked on Monday and Tuesday each, I would only be able to book 3 more hours this week. If we’re ignoring past reservations, then I can keep booking up to 5 more hours for the rest of this week.
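The arithmetic above can be sketched in a few lines of TypeScript. This is purely an illustration of the rule's logic, not Booked's actual implementation; the `Reservation` shape and `remainingHours` helper are invented for the example.

```typescript
// Illustration of the quota math described above -- not Booked's actual code.
interface Reservation {
  start: Date;
  hours: number;
}

function remainingHours(
  limit: number,
  reservations: Reservation[],
  now: Date,
  includePast: boolean,
): number {
  // When ignoring the past, only reservations starting now or later count
  // against the quota.
  const counted = reservations
    .filter((r) => includePast || r.start >= now)
    .reduce((sum, r) => sum + r.hours, 0);
  return Math.max(0, limit - counted);
}

// Wednesday, with one hour booked on Monday and one on Tuesday:
const booked = [
  { start: new Date("2019-11-18T09:00:00"), hours: 1 }, // Monday
  { start: new Date("2019-11-19T09:00:00"), hours: 1 }, // Tuesday
];
const wednesday = new Date("2019-11-20T08:00:00");

remainingHours(5, booked, wednesday, true);  // counting past reservations: 3
remainingHours(5, booked, wednesday, false); // ignoring past reservations: 5
```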

You can also enforce quotas on only certain days or times. This is especially helpful if certain resources tend to be very popular at peak times. So if you wanted to only allow people to book 30 minute reservations between 10am and 2pm on weekdays, you would set up something like this.

Quota rules are cumulative as well, so you can "stack" them, meaning you can limit people to no more than 10 hours per week and no more than 2 hours per day. It's a very powerful way to control how much time people can reserve.

Hosting and Support

Did you know that I offer professional hosting and support for Booked? You can set up a free trial in minutes and get unlimited support.

This article was written on November 20, 2019, so check your documentation for the latest options.

Booked Tips: Showing Events on Your Website

Booked includes a lot of features that you may not know about. In this article we’ll talk about a simple way to display reservations from Booked on your website.

If you just want to display reservations from Booked to your guests without needing them to navigate to the application and browse the schedule, there is a simple JavaScript snippet you can include on your website.

The first thing you’ll need to do is make schedules or resources public. Anything that has not been marked as public will not be displayed.

The next task is a little more technical. Browsers enforce a security policy that prevents a page from loading content from other domains unless the server explicitly allows it (Cross-Origin Resource Sharing). You'll need to tell your web server that it's OK to serve this content to other sites. In Apache, you can accomplish this by adding the following line to your .htaccess file (this requires mod_headers to be enabled).

Header Set Access-Control-Allow-Origin "*"

Once that’s done you’re ready to add the snippet to your website. If you open Help and go to the section titled “Embedding a Calendar Externally” we give you the full snippet. It will look something like this.

<script async src="https://demo.bookedscheduler.com/Web/scripts/embed-calendar.js" crossorigin="anonymous"></script>

You can drop this anywhere in your HTML body content and we’ll load reservations using the default snippet settings.

The snippet displays a very simple HTML component that isn’t the prettiest thing to look at.

My feeling is that you'll want to fit the component in with your website's theme, so you can take advantage of the CSS classes we provide to style it however you like. All of the CSS class names start with booked-.
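As a starting point, a rule like the following would restyle the reservation boxes. Note that the full class name here is hypothetical; inspect the rendered snippet to find the actual names, as only the booked- prefix is guaranteed.

```css
/* Hypothetical class name -- check the rendered markup for the real ones. */
.booked-reservation {
  border: 1px solid #ddd;
  border-radius: 4px;
  padding: 8px;
  font-family: inherit; /* pick up your site's font */
}
```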

Customizing the Contents

There are multiple options you can include on the script to customize what’s shown.

| Name | Possible Values | Default | Details |
| --- | --- | --- | --- |
| type | agenda, week, month | agenda | Controls the view that is shown |
| format | date, title, user, resource | date | Controls the information shown in the reservation box. Multiple options can be passed. For example, to show date and title request date,title |
| d | Any number between 1 and 30 | 7 | Limits the number of days shown for the agenda view |
| sid | Any schedule public ID | All schedules | Limits the reservations shown to a specific schedule |
| rid | Any resource public ID | All resources | Limits the reservations shown to a specific resource |

For example, to show the month view for schedule 123 and the date and title for reservations, you would use the following.

<script async src="https://your-booked-url/Web/scripts/embed-calendar.js?type=month&sid=123&format=date,title" crossorigin="anonymous"></script>

Hosting and Support

Did you know that I offer professional hosting and support for Booked? You can set up a free trial in minutes and get unlimited support.

This article was written on November 15, 2019, so check your documentation for the latest options.