So is the goal here simply to duplicate the vtests using SVG? Or is there additional automation that could be applied to the existing process? If you're already automating the diff reporting, what more does SVG actually offer? Yes, it would exercise both the PNG and SVG export paths, but in terms of automation I don't see the difference between diffing two PNG files and diffing two text files. Am I missing something?
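(One answer, per Lasconic's earlier point that "diffing SVG is easier": a text diff can show *what* changed, while a pixel diff can only say that something did. A minimal sketch of that difference; the SVG snippets below are invented for illustration, not actual MuseScore output:)

```python
# Two versions of a tiny SVG file, as lists of lines.
# Pretend a layout change shifted a text element down by 4 units.
import difflib

ref_svg = [
    '<svg xmlns="http://www.w3.org/2000/svg">',
    '  <text x="10" y="20">Andante</text>',
    '</svg>',
]
new_svg = [
    '<svg xmlns="http://www.w3.org/2000/svg">',
    '  <text x="10" y="24">Andante</text>',  # y shifted by the layout change
    '</svg>',
]

# A unified diff points straight at the moved <text> element;
# a PNG comparison would only report differing pixels.
diff = list(difflib.unified_diff(ref_svg, new_svg, lineterm=""))
print("\n".join(diff))
```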

If it's just duplicating the existing vtests to use SVG, that shouldn't be too hard.

--
Sideways

On 4/26/2016 2:06 PM, Marc Sabatella wrote:
vtests themselves are all created manually - the source files and the reference graphics. The tests are run via the "gen" script in the vtest folder - that's what actually generates the tests files and diffs them against the reference. Go to vtest and type ./gen (you might need to configure some things first so the script uses the right version of MuseScore) and you'll see the tests being run; at the end an HTML file is generated that you can inspect manually.

On Tue, Apr 26, 2016 at 1:36 PM Sideways Skullfinger <[email protected]> wrote:

    On 4/22/2016 12:01 PM, Lasconic wrote:
    > We don't have any SVG test suite, but it's a good idea; we could
    > modify the vtests to render SVG too (or instead of PNG?).
    > (If you don't know what the vtests are, check the vtests directory
    > and http://vtest.musescore.org/index.html)
    >
    > Diffing SVG is easier, so we could even use them for regression
    > testing.
    I now understand this a bit better, and want to understand it fully
    in order to make it happen.  I assume the goal would be to add SVG
    exports to the mtests, and leave the PNG vtests around, at least for
    now.  But I know very little about how all this is set up.

    I now know that the automated testing is set up in the mtests
    folder.  I am assuming that the vtest files are generated
    automatically in mtests somewhere, and I would need to duplicate
    that or add an SVG export of the same scores.  Correct?

    Then there is the automated diff process and reporting on differences
    encountered.  Is there an existing way that this is done in MuseScore?

    Those are my questions for now; I'm sure I'll have more as I get
    into it...

    --
    Sideways


_______________________________________________
Mscore-developer mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/mscore-developer
