There is a rather neat web service called [https://coveralls.io coveralls.io], free for open source projects, which graphically displays unit test line coverage in a colour coded source code browser UI. You also get a badge showing what percentage of your code is covered. It might sound like a bit of a gimmick, but it makes it very easy to quickly visualise what you ''haven't'' covered when you thought you had. Also, if you hook coveralls into your github via travis, coveralls will comment on your pull requests and commits to say whether your test coverage has risen or fallen, which is more than useful: an unexpected catastrophic fall in coverage after a commit probably means you just committed buggy code.

Anyway, firstly take a look at these libraries which use coveralls.io and decide if you like what you see:

* https://coveralls.io/r/krzysztof-jusiak/di?branch=cpp14
* https://coveralls.io/r/BoostGSoC13/boost.afio

Assuming you are now convinced, you obviously need travis working first. You ''can'' use coveralls without travis, but it's a one click enable with travis and github, so we'll assume you've done that. Your next problem will be getting travis to calculate line coverage for you and to send the results to coveralls.

There are two approaches to this, and we'll start with the official one. Firstly you'll need a coveralls API key securely encoded into travis ([http://docs.travis-ci.com/user/environment-variables/#Secure-Variables see this page for how]). Next have a look at https://github.com/krzysztof-jusiak/di/blob/cpp14/.travis.yml, with the key line being:

{{{
after_success:
- if [ "${TRAVIS_BRANCH}" == "cpp14" ] && [ "${VARIANT}" == "coverage" ]; then (sudo pip install requests[security] cpp-coveralls && coveralls -r . -b test/ --gcov /usr/bin/${GCOV} --repo-token c3V44Hj0ZTKzz4kaa3gIlVjInFiyNRZ4f); fi
}}}
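Note that the example above passes the repo token to cpp-coveralls in plain text via --repo-token. If you would rather keep the token out of your repository history, the secure variables mechanism from the travis page linked above lets you encrypt it into .travis.yml instead. A minimal sketch, assuming ruby's gem is available on your machine and that your uploader reads a COVERALLS_REPO_TOKEN environment variable (check your tool's documentation for the exact name it expects):

{{{
# Sketch only: encrypt the coveralls repo token into .travis.yml as a secure
# environment variable rather than pasting it onto the command line.
gem install travis
travis encrypt "COVERALLS_REPO_TOKEN=<token from your coveralls.io project page>" --add env.global
}}}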

That after_success line makes use of the coveralls C++ tool at https://github.com/eddyxu/cpp-coveralls to do the analysis. You'll also need to adjust your Jamfile as per https://github.com/krzysztof-jusiak/di/blob/cpp14/test/Jamfile.v2 with some variant addition like:

{{{
extend-feature variant : coverage ;
compose <variant>coverage :
    <cxxflags>"-fprofile-arcs -ftest-coverage" <linkflags>"-fprofile-arcs"
    <optimization>off
    ;
}}}

... which causes the gcov coverage data files to be output when the unit tests are executed.

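For reference, a hypothetical invocation of that coverage variant might look like the following; the exact toolset name and the directory you run b2 from will depend on your project layout:

{{{
# Sketch only: build and run the unit tests under the coverage variant so that
# .gcda/.gcno data files are left behind for gcov to analyse afterwards.
cd test
b2 toolset=gcc variant=coverage
}}}
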
That's the official way, and you should try it first. However, I personally couldn't get it working: admittedly, when I implemented coveralls support it was a good two years ago and I spent a large chunk of the time fighting the tooling, so I eventually gave up and wrote my own coveralls coverage calculator, partially borrowed from others. You can see mine at https://github.com/BoostGSoC13/boost.afio/blob/master/.travis.yml, where you will note that I inject the -fprofile-arcs etc. arguments into b2 via its cxxflags from the outside. I then invoke a shell script at https://github.com/BoostGSoC13/boost.afio/blob/master/test/update_coveralls.sh:

{{{
#!/bin/bash
# Adapted from https://github.com/purpleKarrot/Karrot/blob/develop/test/coveralls.in
# which itself was adapted from https://github.com/berenm/cmake-extra/blob/master/coveralls-upload.in

if [ 0 -eq $(find -iname "*.gcda" | wc -l) ]
then
  exit 0
fi

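# Run gcov over every .gcda file the test run produced. $1 is the source prefix
# to strip from the reported paths, --preserve-paths keeps directory structure
# in the generated .gcov filenames (with '/' encoded as '#'), and
# --relative-only ignores sources with absolute paths such as system headers.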
gcov-4.8 --source-prefix $1 --preserve-paths --relative-only $(find -iname "*.gcda") 1>/dev/null || exit 0

cat >coverage.json <<EOF
{
  "service_name": "travis-ci",
  "service_job_id": "${TRAVIS_JOB_ID}",
  "run_at": "$(date --iso-8601=s)",
  "source_files": [
EOF

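# Append one source_files entry per .gcov report, excluding third party code
# and the tests themselves.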
for file in $(find * -iname '*.gcov' -print | egrep '.*' | egrep -v 'valgrind|SpookyV2|bindlib|test')
do
  FILEPATH=$(echo ${file} | sed -re 's%#%\/%g; s%.gcov$%%')
  echo Reporting coverage for $FILEPATH ...
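  # Emit this file's JSON fragment. The coverage array has one entry per source
  # line: the execution count, 0 where gcov printed '#####' or '=====' (never
  # executed), or null where it printed '-' (line not executable).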
  cat >>coverage.json <<EOF
    {
      "name": "$FILEPATH",
      "source": $(cat $FILEPATH | python test/json_encode.py),
      "coverage": [$(tail -n +3 ${file} | cut -d ':' -f 1 | sed -re 's%^ +%%g; s%-%null%g; s%^[#=]+$%0%;' | tr $'\n' ',' | sed -re 's%,$%%')]
    },
EOF
done

#cat coverage.json
mv coverage.json coverage.json.tmp
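# head -n -1 drops the trailing '},' of the final entry; the echo then closes
# that entry, the source_files array and the top level object.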
cat >coverage.json <(head -n -1 coverage.json.tmp) <(echo -e " }\n ]\n}")
rm *.gcov coverage.json.tmp

#head coverage.json
#echo
curl -F json_file=@coverage.json https://coveralls.io/api/v1/jobs
#head coverage.json
}}}

This manually invokes gcov to convert the .gcda files into .gcov coverage reports. I then use egrep to include everything and egrep -v to exclude anything matching the pattern, i.e. everything which is not part of the actual AFIO library. You'll note I build up a JSON fragment as I go in the coverage.json temporary file, with the coverage itself generated by chopping the per line information into a very long array matching the coveralls JSON specification as per its API docs. Do note the separate bit of python called to convert the C++ source code into an encoded JSON string (https://github.com/BoostGSoC13/boost.afio/blob/master/test/json_encode.py): I had some problems with UTF-8 in my source code, and forcing it through an ISO-8859 JSON string encode made coveralls happy. I then push the JSON to coveralls using curl. All in all a very blunt instrument, essentially doing exactly the same thing as the official C++ coveralls tool now does, but you may find the manual method useful if the official tool proves too inflexible for your needs.

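Putting the pieces together, the manual route boils down to something like the following; the toolset and paths here are illustrative rather than lifted from the AFIO .travis.yml:

{{{
# Sketch only: inject the coverage flags into b2 from the outside instead of
# defining a dedicated variant, run the unit tests, then hand the tree to the
# upload script, which takes the source prefix as its first argument.
b2 toolset=gcc-4.8 cxxflags="-fprofile-arcs -ftest-coverage" linkflags="-fprofile-arcs"
bash test/update_coveralls.sh "$(pwd)"
}}}
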
I like all-in-one-place software status dashboards where one can tell at a glance whether there is a problem or not. I feel I am far more likely to spot a problem quickly if the dashboard is somewhere I regularly visit, and for that reason I like to mount my status dashboard at the front of my library docs and on my project's github Readme:

* Front of my library docs: https://boostgsoc13.github.io/boost.afio/
* Project's github Readme (bottom of page): https://github.com/BoostGSoC13/boost.afio

Implementing these is ridiculously easy: it's a table in standard HTML, which github markdown conveniently renders as-is, and you can see its source markdown/HTML at https://raw.githubusercontent.com/BoostGSoC13/boost.afio/master/Readme.md. The structure is very simple: columns for OS, Compiler, STL, CPU, Build status and Test status, with three badges in each status row, one each for header only builds, static library builds, and shared DLL builds.

Keen eyes may note that most of that HTML looks automatically generated, and you would be right. The python script at https://github.com/BoostGSoC13/boost.afio/blob/master/scripts/GenJenkinsMatrixDashboard.py holds a matrix of the test targets configured on my Jenkins CI at https://ci.nedprod.com/ and churns out matching HTML. An alternative approach is https://github.com/BoostGSoC13/boost.afio/blob/master/scripts/JenkinsMatrixToDashboard.py, which parses a Jenkins CI test grid from a Matrix Build configuration into a collapsed-space HTML table which fits nicely onto github. If you also want your HTML/markdown dashboard to appear in your BoostBook documentation, the script at https://github.com/BoostGSoC13/boost.afio/blob/master/scripts/readme_to_html.sh with the XSLT at https://github.com/BoostGSoC13/boost.afio/blob/master/scripts/xhtml_to_docbook.xsl should do a fine job.

All of the above dashboarding is fairly Jenkins-centric, so what if you just have Travis + Appveyor? I think Boost.DI has it right by encoding a small but complete status dashboard into its BoostBook docs and github, so examine:

* Front page of library docs (underneath the table of contents): https://krzysztof-jusiak.github.io/di/cpp14/boost/libs/di/doc/html/
* Project's github Readme (bottom of page, look for the badges): https://github.com/krzysztof-jusiak/di

Personally, I'd prefer the line of status badges ''before'' the table of contents, so that I am much more likely to see it when jumping in and to notice that something is red when it shouldn't be. But that is purely a personal preference, and each library author will have their own.

Finally, I think that displaying status summaries via badges like this is another highly visible, universal mark of software quality: it shows that the library author cares enough to publicly show the current state of their library. Future Boost tooling which dashboards Boost libraries and/or ranks them by a quality score will almost certainly find the application specific ids for Travis, Appveyor, Coveralls etc. by searching any Readme.md in the github repo for status badges, so by including status badges in your github Readme.md you ensure that such Boost library ranking scripts will work out of the box with no additional effort from you in the future.

todo: check this compiles

https://en.wikipedia.org/wiki/Policy-based_design

The classic illustration of policy-based design, adapted from the Wikipedia article above, composes a HelloWorld host class from independently swappable output and language policies:
{{{
#include <iostream>
#include <string>

template <typename OutputPolicy, typename LanguagePolicy>
class HelloWorld : private OutputPolicy, private LanguagePolicy
{
    using OutputPolicy::print;
    using LanguagePolicy::message;

public:
    // Behaviour method
    void run() const
    {
        // Two policy methods
        print(message());
    }
};

class OutputPolicyWriteToCout
{
protected:
    template<typename MessageType>
    void print(MessageType const &message) const
    {
        std::cout << message << std::endl;
    }
};

class LanguagePolicyEnglish
{
protected:
    std::string message() const
    {
        return "Hello, World!";
    }
};

class LanguagePolicyGerman
{
protected:
    std::string message() const
    {
        return "Hallo Welt!";
    }
};

int main()
{
    /* Example 1 */
    typedef HelloWorld<OutputPolicyWriteToCout, LanguagePolicyEnglish> HelloWorldEnglish;

    HelloWorldEnglish hello_world;
    hello_world.run(); // prints "Hello, World!"

    /* Example 2
     * Does the same, but uses another language policy */
    typedef HelloWorld<OutputPolicyWriteToCout, LanguagePolicyGerman> HelloWorldGerman;

    HelloWorldGerman hello_world2;
    hello_world2.run(); // prints "Hallo Welt!"
}
}}}