Compare commits
===============

10 commits: `461738dab9` ... `8bd4f9716b`

| SHA1 |
|---|
| 8bd4f9716b |
| ffca7c7721 |
| 71a0c33ec7 |
| 6bd7bf40da |
| 933ec6f604 |
| 104e102177 |
| b8f4671657 |
| 12a4e956a9 |
| 5ccd8048c2 |
| 2617cbe507 |

42 changed files with 4466 additions and 1725 deletions
LICENSE.md (new file, 651 lines)

@@ -0,0 +1,651 @@

GNU Affero General Public License
=================================

_Version 3, 19 November 2007_
_Copyright © 2007 Free Software Foundation, Inc. <http://fsf.org/>_

Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

## Preamble

The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

Developers that use our General Public Licenses protect your rights
with two steps: **(1)** assert copyright on the software, and **(2)** offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

The precise terms and conditions for copying, distribution and
modification follow.

## TERMS AND CONDITIONS

### 0. Definitions

“This License” refers to version 3 of the GNU Affero General Public License.

“Copyright” also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

“The Program” refers to any copyrightable work licensed under this
License. Each licensee is addressed as “you”. “Licensees” and
“recipients” may be individuals or organizations.

To “modify” a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a “modified version” of the
earlier work or a work “based on” the earlier work.

A “covered work” means either the unmodified Program or a work based
on the Program.

To “propagate” a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To “convey” a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays “Appropriate Legal Notices”
to the extent that it includes a convenient and prominently visible
feature that **(1)** displays an appropriate copyright notice, and **(2)**
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

### 1. Source Code

The “source code” for a work means the preferred form of the work
for making modifications to it. “Object code” means any non-source
form of a work.

A “Standard Interface” means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The “System Libraries” of an executable work include anything, other
than the work as a whole, that **(a)** is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and **(b)** serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
“Major Component”, in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The “Corresponding Source” for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.

### 2. Basic Permissions

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

### 3. Protecting Users' Legal Rights From Anti-Circumvention Law

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

### 4. Conveying Verbatim Copies

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

### 5. Conveying Modified Source Versions

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

* **a)** The work must carry prominent notices stating that you modified
  it, and giving a relevant date.
* **b)** The work must carry prominent notices stating that it is
  released under this License and any conditions added under section 7.
  This requirement modifies the requirement in section 4 to
  “keep intact all notices”.
* **c)** You must license the entire work, as a whole, under this
  License to anyone who comes into possession of a copy. This
  License will therefore apply, along with any applicable section 7
  additional terms, to the whole of the work, and all its parts,
  regardless of how they are packaged. This License gives no
  permission to license the work in any other way, but it does not
  invalidate such permission if you have separately received it.
* **d)** If the work has interactive user interfaces, each must display
  Appropriate Legal Notices; however, if the Program has interactive
  interfaces that do not display Appropriate Legal Notices, your
  work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
“aggregate” if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

### 6. Conveying Non-Source Forms

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

* **a)** Convey the object code in, or embodied in, a physical product
  (including a physical distribution medium), accompanied by the
  Corresponding Source fixed on a durable physical medium
  customarily used for software interchange.
* **b)** Convey the object code in, or embodied in, a physical product
  (including a physical distribution medium), accompanied by a
  written offer, valid for at least three years and valid for as
  long as you offer spare parts or customer support for that product
  model, to give anyone who possesses the object code either **(1)** a
  copy of the Corresponding Source for all the software in the
  product that is covered by this License, on a durable physical
  medium customarily used for software interchange, for a price no
  more than your reasonable cost of physically performing this
  conveying of source, or **(2)** access to copy the
  Corresponding Source from a network server at no charge.
* **c)** Convey individual copies of the object code with a copy of the
  written offer to provide the Corresponding Source. This
  alternative is allowed only occasionally and noncommercially, and
  only if you received the object code with such an offer, in accord
  with subsection 6b.
* **d)** Convey the object code by offering access from a designated
  place (gratis or for a charge), and offer equivalent access to the
  Corresponding Source in the same way through the same place at no
  further charge. You need not require recipients to copy the
  Corresponding Source along with the object code. If the place to
  copy the object code is a network server, the Corresponding Source
  may be on a different server (operated by you or a third party)
  that supports equivalent copying facilities, provided you maintain
  clear directions next to the object code saying where to find the
  Corresponding Source. Regardless of what server hosts the
  Corresponding Source, you remain obligated to ensure that it is
  available for as long as needed to satisfy these requirements.
* **e)** Convey the object code using peer-to-peer transmission, provided
  you inform other peers where the object code and Corresponding
  Source of the work are being offered to the general public at no
  charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A “User Product” is either **(1)** a “consumer product”, which means any
tangible personal property which is normally used for personal, family,
or household purposes, or **(2)** anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, “normally used” refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

“Installation Information” for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

### 7. Additional Terms

“Additional permissions” are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

* **a)** Disclaiming warranty or limiting liability differently from the
  terms of sections 15 and 16 of this License; or
* **b)** Requiring preservation of specified reasonable legal notices or
  author attributions in that material or in the Appropriate Legal
  Notices displayed by works containing it; or
* **c)** Prohibiting misrepresentation of the origin of that material, or
  requiring that modified versions of such material be marked in
  reasonable ways as different from the original version; or
* **d)** Limiting the use for publicity purposes of names of licensors or
  authors of the material; or
* **e)** Declining to grant rights under trademark law for use of some
  trade names, trademarks, or service marks; or
* **f)** Requiring indemnification of licensors and authors of that
  material by anyone who conveys the material (or modified versions of
  it) with contractual assumptions of liability to the recipient, for
  any liability that these contractual assumptions directly impose on
  those licensors and authors.

All other non-permissive additional terms are considered “further
restrictions” within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

### 8. Termination

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated **(a)**
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and **(b)** permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

### 9. Acceptance Not Required for Having Copies

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

### 10. Automatic Licensing of Downstream Recipients

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An “entity transaction” is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

### 11. Patents

A “contributor” is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's “contributor version”.

A contributor's “essential patent claims” are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, “control” includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a “patent license” is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To “grant” such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either **(1)** cause the Corresponding Source to be so
available, or **(2)** arrange to deprive yourself of the benefit of the
patent license for this particular work, or **(3)** arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. “Knowingly relying” means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is “discriminatory” if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license **(a)** in connection with copies of the covered work
conveyed by you (or copies made from those copies), or **(b)** primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

### 12. No Surrender of Others' Freedom

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

### 13. Remote Network Interaction; Use with the GNU General Public License

Notwithstanding any other provision of this License, if you modify the
Program, your modified version must prominently offer all users
interacting with it remotely through a computer network (if your version
supports such interaction) an opportunity to receive the Corresponding
Source of your version by providing access to the Corresponding Source
from a network server at no charge, through some standard or customary
means of facilitating copying of software. This Corresponding Source
shall include the Corresponding Source for any work covered by version 3
of the GNU General Public License that is incorporated pursuant to the
following paragraph.

Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the work with which it is combined will remain governed by version
3 of the GNU General Public License.

### 14. Revised Versions of this License
|
||||
|
||||
The Free Software Foundation may publish revised and/or new versions of
|
||||
the GNU Affero General Public License from time to time. Such new versions
|
||||
will be similar in spirit to the present version, but may differ in detail to
|
||||
address new problems or concerns.
|
||||
|
||||
Each version is given a distinguishing version number. If the
|
||||
Program specifies that a certain numbered version of the GNU Affero General
|
||||
Public License “or any later version” applies to it, you have the
|
||||
option of following the terms and conditions either of that numbered
|
||||
version or of any later version published by the Free Software
|
||||
Foundation. If the Program does not specify a version number of the
|
||||
GNU Affero General Public License, you may choose any version ever published
|
||||
by the Free Software Foundation.
|
||||
|
||||
If the Program specifies that a proxy can decide which future
|
||||
versions of the GNU Affero General Public License can be used, that proxy's
|
||||
public statement of acceptance of a version permanently authorizes you
|
||||
to choose that version for the Program.
|
||||
|
||||
Later license versions may give you additional or different
|
||||
permissions. However, no additional obligations are imposed on any
|
||||
author or copyright holder as a result of your choosing to follow a
|
||||
later version.
|
||||
|
||||
### 15. Disclaimer of Warranty
|
||||
|
||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY
|
||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||
|
||||
### 16. Limitation of Liability
|
||||
|
||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||
SUCH DAMAGES.
|
||||
|
||||
### 17. Interpretation of Sections 15 and 16
|
||||
|
||||
If the disclaimer of warranty and limitation of liability provided
|
||||
above cannot be given local legal effect according to their terms,
|
||||
reviewing courts shall apply local law that most closely approximates
|
||||
an absolute waiver of all civil liability in connection with the
|
||||
Program, unless a warranty or assumption of liability accompanies a
|
||||
copy of the Program in return for a fee.
|
||||
|
||||
_END OF TERMS AND CONDITIONS_
|
||||
|
||||
## How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
state the exclusion of warranty; and each file should have at least
|
||||
the “copyright” line and a pointer to where the full notice is found.
|
||||
|
||||
<one line to give the program's name and a brief idea of what it does.>
|
||||
Copyright (C) <year> <name of author>
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU Affero General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU Affero General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU Affero General Public License
|
||||
along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If your software can interact with users remotely through a computer
|
||||
network, you should also make sure that it provides a way for users to
|
||||
get its source. For example, if your program is a web application, its
|
||||
interface could display a “Source” link that leads users to an archive
|
||||
of the code. There are many ways you could offer source, and different
|
||||
solutions will be better for different programs; see section 13 for the
|
||||
specific requirements.
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a “copyright disclaimer” for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU AGPL, see
|
||||
<http://www.gnu.org/licenses/>
|
48
README.md
Normal file

@@ -0,0 +1,48 @@
# NML -- Not a markup language!

Currently a work in progress, expect features and fixes to arrive soon!

# Requirements

Some features require external dependencies to work.

## LaTeX rendering for HTML

We ship a modified version of `latex2svg` by Matthias C. Hormann.
The modified program can be found in [third/latex2svg](third/latex2svg) and is licensed under MIT.

Follow the installation instructions on [latex2svg's repository](https://github.com/Moonbase59/latex2svg).

## Graphviz rendering

To render Graphviz graphs (`[graph]...[/graph]`), you need to install the `dot` program from [Graphviz](https://graphviz.org/).

## Lua kernels

To execute Lua kernels you need to install `liblua` version 5.4.
Support for a statically linked Lua may be added in the future.

# Compiling

```
cargo build --release --bin nml
```

# Features roadmap

 - [x] Paragraphs
 - [x] LaTeX rendering
 - [x] Graphviz rendering
 - [x] Media
 - [ ] References
 - [ ] Complete Lua API
 - [ ] Documentation
 - [ ] Tables
 - [ ] LaTeX output
 - [ ] LSP

# License

NML is licensed under the GNU AGPL version 3 or later. See [LICENSE.md](LICENSE.md) for more information.
Licenses for third-party dependencies can be listed with `cargo license`.
48
readme.nml

@@ -3,7 +3,7 @@

@'html.css = style.css

@tex.main.fontsize = 9
@tex.main.preamble = \usepackage{xcolor} \\
@tex.main.preamble = \usepackage{xcolor, amsmath} \\
\definecolor{__color1}{HTML}{d5d5d5} \\
\everymath{\color{__color1}\displaystyle}
@tex.main.block_prepend = \color{__color1}

@@ -77,10 +77,52 @@ end

Evaluating `!` from a kernel: `\%<![kernel] eval>\%`

* `\%<![example] make_bold("Hello, World!")>\%` → %<![example] make_bold("Hello, World!")>%
* `\%<[example]! make_bold("Hello, World!")>\%` → %<[example]! make_bold("Hello, World!")>%

# Latex

# Support for inline maths:
## Support for inline maths:
* $\sum^{\infty}_{k=1} \frac{1}{k^2} = \frac{\pi^2}{6}$
* $n! = \int_0^\infty t^n e^{-t} \text{ d}t$

# Graphs

NML adds support for *Graphviz* graphs.

[graph][
width=600px,
layout=neato
]
digraph g {
bgcolor="transparent"
fontname="Helvetica,Arial,sans-serif"
node [fontname="Helvetica,Arial,sans-serif"]
edge [fontname="Helvetica,Arial,sans-serif"]
graph [fontsize=30 labelloc="t" label="" splines=true overlap=false rankdir = "LR"];
"state0" [ style = "filled, bold" penwidth = 5 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #0</font></td></tr><tr><td align="left" port="r0">(0) s -> •e $ </td></tr><tr><td align="left" port="r1">(1) e -> •l '=' r </td></tr><tr><td align="left" port="r2">(2) e -> •r </td></tr><tr><td align="left" port="r3">(3) l -> •'*' r </td></tr><tr><td align="left" port="r4">(4) l -> •'n' </td></tr><tr><td align="left" port="r5">(5) r -> •l </td></tr></table>> ];
"state1" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #1</font></td></tr><tr><td align="left" port="r3">(3) l -> •'*' r </td></tr><tr><td align="left" port="r3">(3) l -> '*' •r </td></tr><tr><td align="left" port="r4">(4) l -> •'n' </td></tr><tr><td align="left" port="r5">(5) r -> •l </td></tr></table>> ];
"state2" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #2</font></td></tr><tr><td align="left" port="r4">(4) l -> 'n' •</td><td bgcolor="grey" align="right">=$</td></tr></table>> ];
"state3" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #3</font></td></tr><tr><td align="left" port="r5">(5) r -> l •</td><td bgcolor="grey" align="right">=$</td></tr></table>> ];
"state4" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #4</font></td></tr><tr><td align="left" port="r3">(3) l -> '*' r •</td><td bgcolor="grey" align="right">=$</td></tr></table>> ];
"state5" [ style = "filled" penwidth = 1 fillcolor = "black" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="black"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #5</font></td></tr><tr><td align="left" port="r0"><font color="white">(0) s -> e •$ </font></td></tr></table>> ];
"state6" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #6</font></td></tr><tr><td align="left" port="r1">(1) e -> l •'=' r </td></tr><tr><td align="left" port="r5">(5) r -> l •</td><td bgcolor="grey" align="right">$</td></tr></table>> ];
"state7" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #7</font></td></tr><tr><td align="left" port="r1">(1) e -> l '=' •r </td></tr><tr><td align="left" port="r3">(3) l -> •'*' r </td></tr><tr><td align="left" port="r4">(4) l -> •'n' </td></tr><tr><td align="left" port="r5">(5) r -> •l </td></tr></table>> ];
"state8" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #8</font></td></tr><tr><td align="left" port="r1">(1) e -> l '=' r •</td><td bgcolor="grey" align="right">$</td></tr></table>> ];
"state9" [ style = "filled" penwidth = 1 fillcolor = "white" fontname = "Courier New" shape = "Mrecord" label =<<table border="0" cellborder="0" cellpadding="3" bgcolor="white"><tr><td bgcolor="black" align="center" colspan="2"><font color="white">State #9</font></td></tr><tr><td align="left" port="r2">(2) e -> r •</td><td bgcolor="grey" align="right">$</td></tr></table>> ];
state0 -> state5 [ penwidth = 5 fontsize = 28 fontcolor = "black" label = "e" ];
state0 -> state6 [ penwidth = 5 fontsize = 28 fontcolor = "black" label = "l" ];
state0 -> state9 [ penwidth = 5 fontsize = 28 fontcolor = "black" label = "r" ];
state0 -> state1 [ penwidth = 1 fontsize = 14 fontcolor = "grey28" label = "'*'" ];
state0 -> state2 [ penwidth = 1 fontsize = 14 fontcolor = "grey28" label = "'n'" ];
state1 -> state1 [ penwidth = 1 fontsize = 14 fontcolor = "grey28" label = "'*'" ];
state1 -> state4 [ penwidth = 5 fontsize = 28 fontcolor = "black" label = "r" ];
state1 -> state2 [ penwidth = 1 fontsize = 14 fontcolor = "grey28" label = "'n'" ];
state1 -> state3 [ penwidth = 5 fontsize = 28 fontcolor = "black" label = "l" ];
state6 -> state7 [ penwidth = 1 fontsize = 14 fontcolor = "grey28" label = "'='" ];
state7 -> state8 [ penwidth = 5 fontsize = 28 fontcolor = "black" label = "r" ];
state7 -> state1 [ penwidth = 1 fontsize = 14 fontcolor = "grey28" label = "'*'" ];
state7 -> state2 [ penwidth = 1 fontsize = 14 fontcolor = "grey28" label = "'n'" ];
state7 -> state3 [ penwidth = 5 fontsize = 28 fontcolor = "black" label = "l" ];
}
[/graph]
77
src/cache/cache.rs
vendored

@@ -1,29 +1,13 @@

use std::{error::Error, path::PathBuf};
use rusqlite::types::FromSql;
use rusqlite::Connection;
use rusqlite::ToSql;

use rusqlite::{types::FromSql, Connection, Params, ToSql};

struct Cache {
	con: Connection
}

impl Cache {
	fn new(file: PathBuf) -> Result<Self, String> {
		match Connection::open(file)
		{
			Err(e) => return Err(format!("Could not connect to cache database: {}", e.to_string())),
			Ok(con) => Ok(Self { con })
		}
	}
}

pub enum CachedError<E>
{
pub enum CachedError<E> {
	SqlErr(rusqlite::Error),
	GenErr(E)
	GenErr(E),
}

pub trait Cached
{
pub trait Cached {
	type Key;
	type Value;

@@ -39,10 +23,8 @@ pub trait Cached

	fn key(&self) -> <Self as Cached>::Key;

	fn init(con: &mut Connection) -> Result<(), rusqlite::Error>
	{
		con.execute(<Self as Cached>::sql_table(), ())
			.map(|_| ())
	fn init(con: &mut Connection) -> Result<(), rusqlite::Error> {
		con.execute(<Self as Cached>::sql_table(), ()).map(|_| ())
	}

	/// Attempts to retrieve a cached element from the compilation database

@@ -54,8 +36,11 @@ pub trait Cached
	/// or if not cached, an error from the generator [`f`]
	///
	/// Note that on error, [`f`] may still have been called
	fn cached<E, F>(&self, con: &mut Connection, f: F)
		-> Result<<Self as Cached>::Value, CachedError<E>>
	fn cached<E, F>(
		&self,
		con: &mut Connection,
		f: F,
	) -> Result<<Self as Cached>::Value, CachedError<E>>
	where
		<Self as Cached>::Key: ToSql,
		<Self as Cached>::Value: FromSql + ToSql,

@@ -64,42 +49,36 @@ pub trait Cached
		let key = self.key();

		// Find in cache
		let mut query = match con.prepare(<Self as Cached>::sql_get_query())
		{
		let mut query = match con.prepare(<Self as Cached>::sql_get_query()) {
			Ok(query) => query,
			Err(e) => return Err(CachedError::SqlErr(e))
			Err(e) => return Err(CachedError::SqlErr(e)),
		};

		let value = query.query_row([&key], |row|
			{
		let value = query
			.query_row([&key], |row| {
				Ok(row.get_unwrap::<_, <Self as Cached>::Value>(0))
			}).ok();
			})
			.ok();

		if let Some(value) = value
		{
		if let Some(value) = value {
			// Found in cache
			return Ok(value)
		}
		else
		{
			return Ok(value);
		} else {
			// Compute a value
			let value = match f(&self)
			{
			let value = match f(&self) {
				Ok(val) => val,
				Err(e) => return Err(CachedError::GenErr(e))
				Err(e) => return Err(CachedError::GenErr(e)),
			};

			// Try to insert
			let mut query = match con.prepare(<Self as Cached>::sql_insert_query())
			{
			let mut query = match con.prepare(<Self as Cached>::sql_insert_query()) {
				Ok(query) => query,
				Err(e) => return Err(CachedError::SqlErr(e))
				Err(e) => return Err(CachedError::SqlErr(e)),
			};

			match query.execute((&key, &value))
			{
			match query.execute((&key, &value)) {
				Ok(_) => Ok(value),
				Err(e) => Err(CachedError::SqlErr(e))
				Err(e) => Err(CachedError::SqlErr(e)),
			}
		}
	}
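The `Cached::cached` flow above (look the key up, otherwise run the generator `f` and insert its result) is a standard compute-or-store pattern. A minimal sketch of the same logic, with a hypothetical `HashMap`-backed `Memo` type standing in for the SQLite connection (not part of this crate):

```rust
use std::collections::HashMap;

// Hypothetical in-memory stand-in for the SQLite-backed cache.
struct Memo {
    store: HashMap<String, String>,
}

impl Memo {
    fn new() -> Self {
        Self { store: HashMap::new() }
    }

    // Look the key up; on a miss, run the generator `f` and store its result.
    // As in `Cached::cached`, `f` only runs when the key is absent.
    fn cached<E, F>(&mut self, key: &str, f: F) -> Result<String, E>
    where
        F: FnOnce() -> Result<String, E>,
    {
        if let Some(value) = self.store.get(key) {
            return Ok(value.clone()); // found in cache
        }
        let value = f()?; // compute a value
        self.store.insert(key.to_string(), value.clone()); // try to insert
        Ok(value)
    }
}

fn main() {
    let mut memo = Memo::new();
    let mut calls = 0;

    let first = memo
        .cached("doc.tex", || {
            calls += 1;
            Ok::<String, ()>("rendered".to_string())
        })
        .unwrap();
    let second = memo
        .cached("doc.tex", || {
            calls += 1;
            Ok::<String, ()>("rendered again".to_string())
        })
        .unwrap();

    assert_eq!(first, "rendered");
    assert_eq!(second, "rendered"); // cache hit: generator not re-run
    assert_eq!(calls, 1);
}
```

Note the caveat from the doc comment above: if the insert fails after `f` ran, the generator's work has already happened, which is why the trait documents that "on error, `f` may still have been called".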
@@ -62,12 +62,12 @@ impl Compiler
	}
	}

	pub fn header(&self, document: &Document) -> String
	pub fn header(&self, document: &dyn Document) -> String
	{
	pub fn get_variable_or_error(document: &Document, var_name: &'static str) -> Option<Rc<dyn Variable>>
	pub fn get_variable_or_error(document: &dyn Document, var_name: &'static str) -> Option<Rc<dyn Variable>>
	{
		document.get_variable(var_name)
			.and_then(|(_, var)| Some(var))
			.and_then(|var| Some(var))
			.or_else(|| {
				println!("Missing variable `{var_name}` in {}", document.source().name());
				None

@@ -85,7 +85,7 @@ impl Compiler
		result += format!("<title>{}</title>", self.sanitize(page_title.to_string())).as_str();
		}

		if let Some((_, css)) = document.get_variable("html.css")
		if let Some(css) = document.get_variable("html.css")
		{
			result += format!("<link rel=\"stylesheet\" href=\"{}\">", self.sanitize(css.to_string())).as_str();
		}

@@ -101,7 +101,7 @@ impl Compiler
		result
	}

	pub fn footer(&self, _document: &Document) -> String
	pub fn footer(&self, _document: &dyn Document) -> String
	{
		let mut result = String::new();
		match self.target()

@@ -116,10 +116,10 @@ impl Compiler
		result
	}

	pub fn compile(&self, document: &Document) -> String
	pub fn compile(&self, document: &dyn Document) -> String
	{
		let mut out = String::new();
		let borrow = document.content.borrow();
		let borrow = document.content().borrow();

		// Header
		out += self.header(document).as_str();
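The recurring change in these hunks is `&Document` becoming `&dyn Document`: the compiler now accepts a trait object, so any type implementing the trait can be compiled, not just one concrete struct. A minimal sketch of that shape (all names here are illustrative, not the crate's real API):

```rust
use std::fmt::Debug;

// Illustrative trait object version: `header` takes `&dyn Document`,
// so it works for every implementation of the trait.
trait Document: Debug {
    fn name(&self) -> String;
}

#[derive(Debug)]
struct LangDoc(String);

impl Document for LangDoc {
    fn name(&self) -> String {
        self.0.clone()
    }
}

fn header(document: &dyn Document) -> String {
    format!("<title>{}</title>", document.name())
}

fn main() {
    let doc = LangDoc("readme.nml".to_string());
    assert_eq!(header(&doc), "<title>readme.nml</title>");
}
```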
@@ -1,4 +1,6 @@

use std::cell::{Ref, RefCell, RefMut};
use std::cell::Ref;
use std::cell::RefCell;
use std::cell::RefMut;
use std::collections::hash_map::HashMap;
use std::rc::Rc;

@@ -7,7 +9,9 @@ use crate::parser::source::Source;
use super::element::Element;
use super::variable::Variable;

// TODO: Referenceable rework
// Usize based referencing is not an acceptable method
// if we want to support deltas for the lsp
#[derive(Debug)]
pub struct Scope {
	/// List of all referenceable elements in current scope.

@@ -17,168 +21,137 @@ pub struct Scope {
}

impl Scope {
	fn new() -> Self {
	pub fn new() -> Self {
		Self {
			referenceable: HashMap::new(),
			variables: HashMap::new(),
		}
	}

	pub fn merge(&mut self, other: &mut Scope, merge_as: &String, ref_offset: usize)
	{
		match merge_as.is_empty()
		{
	pub fn merge(&mut self, other: &mut Scope, merge_as: &String, ref_offset: usize) {
		match merge_as.is_empty() {
			true => {
				// References
				self.referenceable.extend(other.referenceable.drain()
					.map(|(name, idx)|
						(name, idx+ref_offset)));
				self.referenceable.extend(
					other
						.referenceable
						.drain()
						.map(|(name, idx)| (name, idx + ref_offset)),
				);

				// Variables
				self.variables.extend(other.variables.drain()
					.map(|(name, var)|
						(name, var)));
			},
				self.variables
					.extend(other.variables.drain().map(|(name, var)| (name, var)));
			}
			false => {
				// References
				self.referenceable.extend(other.referenceable.drain()
					.map(|(name, idx)|
						(format!("{merge_as}.{name}"), idx+ref_offset)));
				self.referenceable.extend(
					other
						.referenceable
						.drain()
						.map(|(name, idx)| (format!("{merge_as}.{name}"), idx + ref_offset)),
				);

				// Variables
				self.variables.extend(other.variables.drain()
					.map(|(name, var)|
						(format!("{merge_as}.{name}"), var)));
				self.variables.extend(
					other
						.variables
						.drain()
						.map(|(name, var)| (format!("{merge_as}.{name}"), var)),
				);
			}
		}
	}
}

#[derive(Debug)]
pub struct Document<'a> {
	source: Rc<dyn Source>,
	parent: Option<&'a Document<'a>>, /// Document's parent
pub trait Document<'a>: core::fmt::Debug {
	/// Gets the document [`Source`]
	fn source(&self) -> Rc<dyn Source>;

	// FIXME: Render these fields private
	pub content: RefCell<Vec<Box<dyn Element>>>,
	pub scope: RefCell<Scope>,
}
	/// Gets the document parent (if it exists)
	fn parent(&self) -> Option<&'a dyn Document<'a>>;

impl<'a> Document<'a> {
	pub fn new(source: Rc<dyn Source>, parent: Option<&'a Document<'a>>) -> Self
	{
		Self {
			source: source,
			parent: parent,
			content: RefCell::new(Vec::new()),
			scope: RefCell::new(Scope::new()),
		}
	/// Gets the document content
	/// The content is essentially the AST for the document
	fn content(&self) -> &RefCell<Vec<Box<dyn Element>>>;

	/// Gets the document [`Scope`]
	fn scope(&self) -> &RefCell<Scope>;

	/// Pushes a new element into the document's content
	fn push(&self, elem: Box<dyn Element>) {
		// TODO: RefTable

		self.content().borrow_mut().push(elem);
	}

	pub fn source(&self) -> Rc<dyn Source> { self.source.clone() }

	pub fn parent(&self) -> Option<&Document> { self.parent }

	/// Push an element [`elem`] to content. [`in_paragraph`] is true if a paragraph is active
	pub fn push(&self, elem: Box<dyn Element>)
	/*
	fn last_element(&'a self, recurse: bool) -> Option<Ref<'_, dyn Element>>
	{
		// Add index of current element to scope's reference table
		if let Some(referenceable) = elem.as_referenceable()
		{
			// Only add if referenceable holds a reference
			if let Some(ref_name) = referenceable.reference_name()
			{
				self.scope.borrow_mut().referenceable.insert(ref_name.clone(), self.content.borrow().len());
			}
		}

		self.content.borrow_mut().push(elem);
	}

	pub fn last_element<T: Element>(&self, recurse: bool) -> Option<Ref<'_, T>>
	{
		let elem = Ref::filter_map(self.content.borrow(),
		let elem = Ref::filter_map(self.content().borrow(),
			|content| content.last()
			.and_then(|last| last.downcast_ref::<T>())).ok();
			.and_then(|last| last.downcast_ref::<Element>())
		).ok();

		if elem.is_some() || !recurse { return elem }

		match self.parent
		match self.parent()
		{
			None => None,
			Some(parent) => parent.last_element(true),
		}
	}

	pub fn last_element_mut<T: Element>(&self, recurse: bool) -> Option<RefMut<'_, T>>
	fn last_element_mut(&'a self, recurse: bool) -> Option<RefMut<'_, dyn Element>>
	{
		let elem = RefMut::filter_map(self.content.borrow_mut(),
			|content| content.last_mut()
			.and_then(|last| last.downcast_mut::<T>())).ok();
		let elem = RefMut::filter_map(self.content().borrow_mut(),
			|content| content.last_mut()).ok();

		if elem.is_some() || !recurse { return elem }

		match self.parent
		match self.parent()
		{
			None => None,
			Some(parent) => parent.last_element_mut(true),
		}
	}
	*/

	pub fn get_reference(&self, ref_name: &str) -> Option<(&Document<'a>, std::cell::Ref<'_, Box<dyn Element>>)> {
		match self.scope.borrow().referenceable.get(ref_name) {
			// Return if found
			Some(elem) => {
				return Some((&self,
					std::cell::Ref::map(self.content.borrow(),
					|m| &m[*elem])))
			},

			// Continue search recursively
			None => match self.parent {
				Some(parent) => return parent.get_reference(ref_name),

				// Not found
				None => return None,
			}
		}
	fn add_variable(&self, variable: Rc<dyn Variable>) {
		self.scope()
			.borrow_mut()
			.variables
			.insert(variable.name().to_string(), variable);
	}

	pub fn add_variable(&self, variable: Rc<dyn Variable>)
	{
		self.scope.borrow_mut().variables.insert(
			variable.name().to_string(),
			variable);
	}

	pub fn get_variable<S: AsRef<str>>(&self, name: S) -> Option<(&Document<'a>, Rc<dyn Variable>)>
	{
		match self.scope.borrow().variables.get(name.as_ref())
		{
	fn get_variable(&self, name: &str) -> Option<Rc<dyn Variable>> {
		match self.scope().borrow().variables.get(name) {
			Some(variable) => {
				return Some((&self, variable.clone()));
			},
				return Some(variable.clone());
			}

			// Continue search recursively
			None => match self.parent {
			None => match self.parent() {
				Some(parent) => return parent.get_variable(name),

				// Not found
				None => return None,
			}
			},
		}
	}

	pub fn remove_variable<S: AsRef<str>>(&self, name: S) -> Option<(&Document<'a>, Rc<dyn Variable>)>
	/*
	fn remove_variable(&self, name: &str) -> Option<Rc<dyn Variable>>
	{
		match self.scope.borrow_mut().variables.remove(name.as_ref())
		match self.scope().borrow_mut().variables.remove(name)
		{
			Some(variable) => {
				return Some((&self, variable.clone()));
				return Some(variable.clone());
			},

			// Continue search recursively
			None => match self.parent {
			None => match self.parent() {
				Some(parent) => return parent.remove_variable(name),

				// Not found

@@ -186,25 +159,48 @@ impl<'a> Document<'a> {
			}
		}
	}
	*/

	/// Merges [`other`] into [`self`]
	pub fn merge(&self, other: Document, merge_as: Option<&String>)
	{
		match merge_as
		{
			Some(merge_as) => self.scope.borrow_mut()
				.merge(
					&mut *other.scope.borrow_mut(),
	fn merge(
		&self,
		content: &RefCell<Vec<Box<dyn Element>>>,
		scope: &RefCell<Scope>,
		merge_as: Option<&String>,
	) {
		match merge_as {
			Some(merge_as) => self.scope().borrow_mut().merge(
				&mut *scope.borrow_mut(),
				merge_as,
					self.content.borrow().len()+1),
			_ => {},
				self.content().borrow().len() + 1,
			),
			_ => {}
		}

		// Content
		self.content.borrow_mut().extend((other.content.borrow_mut())
			.drain(..)
			.map(|value| value));
		self.content()
			.borrow_mut()
			.extend((content.borrow_mut()).drain(..).map(|value| value));
	}
}

pub trait DocumentAccessors<'a> {
	fn last_element<T: Element>(&self) -> Option<Ref<'_, T>>;
	fn last_element_mut<T: Element>(&self) -> Option<RefMut<'_, T>>;
}

impl<'a> DocumentAccessors<'a> for dyn Document<'a> + '_ {
	fn last_element<T: Element>(&self) -> Option<Ref<'_, T>> {
		Ref::filter_map(self.content().borrow(), |content| {
			content.last().and_then(|last| last.downcast_ref::<T>())
		})
		.ok()
	}

	fn last_element_mut<T: Element>(&self) -> Option<RefMut<'_, T>> {
		RefMut::filter_map(self.content().borrow_mut(), |content| {
			content.last_mut().and_then(|last| last.downcast_mut::<T>())
		})
		.ok()
	}
}
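`DocumentAccessors::last_element` combines `Ref::filter_map` with `downcast_ref` to hand out a typed borrow of the last AST element without cloning it or releasing the `RefCell` guard. A self-contained sketch of the same pattern, with `std::any::Any` standing in for the crate's `Element` trait (which also supports downcasting):

```rust
use std::any::Any;
use std::cell::{Ref, RefCell};

// Same shape as `DocumentAccessors::last_element`: borrow the vector,
// try to downcast its last element, and keep the borrow only on success.
fn last_element<T: 'static>(content: &RefCell<Vec<Box<dyn Any>>>) -> Option<Ref<'_, T>> {
    Ref::filter_map(content.borrow(), |vec| {
        vec.last().and_then(|last| last.downcast_ref::<T>())
    })
    .ok()
}

fn main() {
    let content: RefCell<Vec<Box<dyn Any>>> = RefCell::new(vec![
        Box::new(1u32),
        Box::new("paragraph".to_string()),
    ]);

    // The last element is a `String`, so a `u32` lookup fails...
    assert!(last_element::<u32>(&content).is_none());
    // ...while a `String` lookup yields a borrow into the vector.
    assert_eq!(*last_element::<String>(&content).unwrap(), "paragraph");
}
```

`Ref::filter_map` returns `Err` with the original guard when the closure yields `None`, which `.ok()` turns into `None`; on the `Err` path the borrow is dropped immediately, so a failed lookup does not keep the document borrowed.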
@@ -48,7 +48,7 @@ pub trait Element: Downcast
	fn as_referenceable(&self) -> Option<&dyn ReferenceableElement> { None }

	/// Compiles element
	fn compile(&self, compiler: &Compiler, document: &Document) -> Result<String, String>;
	fn compile(&self, compiler: &Compiler, document: &dyn Document) -> Result<String, String>;
}
impl_downcast!(Element);
38
src/document/langdocument.rs
Normal file

@@ -0,0 +1,38 @@

use std::{cell::RefCell, rc::Rc};

use crate::parser::source::Source;

use super::{document::{Document, Scope}, element::Element};

#[derive(Debug)]
pub struct LangDocument<'a> {
	source: Rc<dyn Source>,
	parent: Option<&'a dyn Document<'a>>, /// Document's parent

	// FIXME: Render these fields private
	pub content: RefCell<Vec<Box<dyn Element>>>,
	pub scope: RefCell<Scope>,
}

impl<'a> LangDocument<'a>
{
	pub fn new(source: Rc<dyn Source>, parent: Option<&'a dyn Document<'a>>) -> Self
	{
		Self {
			source: source,
			parent: parent,
			content: RefCell::new(Vec::new()),
			scope: RefCell::new(Scope::new()),
		}
	}
}

impl<'a> Document<'a> for LangDocument<'a> {
	fn source(&self) -> Rc<dyn Source> { self.source.clone() }

	fn parent(&self) -> Option<&'a dyn Document<'a>> { self.parent.and_then(|p| Some(p as &dyn Document<'a>)) }

	fn content(&self) -> &RefCell<Vec<Box<dyn Element>>> { &self.content }

	fn scope(&self) -> &RefCell<Scope> { &self.scope }
}
@@ -1,3 +1,5 @@
pub mod document;
pub mod references;
pub mod langdocument;
pub mod element;
pub mod variable;
28
src/document/references.rs
Normal file
@@ -0,0 +1,28 @@
pub fn validate_refname(name: &str) -> Result<&str, String> {
    let trimmed = name.trim_start().trim_end();
    if trimmed.is_empty() {
        return Err("Refname cannot be empty".to_string());
    }

    for c in trimmed.chars() {
        if c.is_ascii_punctuation() {
            return Err(format!(
                "Refname `{trimmed}` cannot contain punctuation codepoint: `{c}`"
            ));
        }

        if c.is_whitespace() {
            return Err(format!(
                "Refname `{trimmed}` cannot contain whitespaces: `{c}`"
            ));
        }

        if c.is_control() {
            return Err(format!(
                "Refname `{trimmed}` cannot contain control codepoint: `{c}`"
            ));
        }
    }

    Ok(trimmed)
}
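The rules above (trim, then reject punctuation, whitespace, and control codepoints) can be exercised with a small stand-alone copy of the function:

```rust
// Stand-alone copy of validate_refname for illustration; the real
// function lives in src/document/references.rs.
fn validate_refname(name: &str) -> Result<&str, String> {
    let trimmed = name.trim_start().trim_end();
    if trimmed.is_empty() {
        return Err("Refname cannot be empty".to_string());
    }
    for c in trimmed.chars() {
        if c.is_ascii_punctuation() {
            return Err(format!("Refname `{trimmed}` cannot contain punctuation codepoint: `{c}`"));
        }
        if c.is_whitespace() {
            return Err(format!("Refname `{trimmed}` cannot contain whitespaces: `{c}`"));
        }
        if c.is_control() {
            return Err(format!("Refname `{trimmed}` cannot contain control codepoint: `{c}`"));
        }
    }
    Ok(trimmed)
}

fn main() {
    // Leading/trailing whitespace is trimmed away rather than rejected...
    assert_eq!(validate_refname("  intro  "), Ok("intro"));
    // ...but inner whitespace, punctuation, and emptiness are errors.
    assert!(validate_refname("my ref").is_err());
    assert!(validate_refname("my.ref").is_err());
    assert!(validate_refname("").is_err());
}
```

Note that the trim happens before the emptiness check, so a name made only of spaces is reported as empty, not as containing whitespace.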
@@ -1,6 +1,6 @@
use std::{path::PathBuf, rc::Rc};
use crate::{elements::text::Text, parser::{parser::Parser, source::{Source, Token, VirtualSource}}};
use super::{document::Document};
use super::document::Document;

// TODO enforce to_string(from_string(to_string())) == to_string()
@@ -15,7 +15,7 @@ pub trait Variable
    /// Converts variable to a string
    fn to_string(&self) -> String;

    fn parse<'a>(&self, location: Token, parser: &dyn Parser, document: &'a Document);
    fn parse<'a>(&self, location: Token, parser: &dyn Parser, document: &'a dyn Document<'a>);
}

impl core::fmt::Debug for dyn Variable
@@ -52,7 +52,7 @@ impl Variable for BaseVariable
    fn to_string(&self) -> String { self.value.clone() }

    fn parse<'a>(&self, _location: Token, parser: &dyn Parser, document: &'a Document) {
    fn parse<'a>(&self, _location: Token, parser: &dyn Parser, document: &'a dyn Document<'a>) {
        let source = Rc::new(VirtualSource::new(
            self.location().clone(),
            self.name().to_string(),
@@ -90,12 +90,12 @@ impl Variable for PathVariable
    fn to_string(&self) -> String { self.path.to_str().unwrap().to_string() }

    fn parse<'a>(&self, location: Token, parser: &dyn Parser, document: &'a Document){
    // TODO: Avoid copying the location twice...
    fn parse<'a>(&self, location: Token, parser: &dyn Parser, document: &'a dyn Document) {
        // TODO: Avoid copying the content...
        // Maybe create a special VirtualSource where the `content()` method
        // calls `Variable::to_string()`
        let source = Rc::new(VirtualSource::new(
            location.clone(),
            location,
            self.name().to_string(),
            self.to_string()));
@@ -1,25 +1,48 @@
use std::{collections::HashMap, ops::Range, rc::Rc, sync::Once};
use std::collections::HashMap;
use std::ops::Range;
use std::rc::Rc;
use std::sync::Once;

use ariadne::{Fmt, Label, Report, ReportKind};
use crypto::{digest::Digest, sha2::Sha512};
use mlua::{Function, Lua};
use regex::{Captures, Regex};
use syntect::{easy::HighlightLines, highlighting::ThemeSet, parsing::SyntaxSet};
use ariadne::Fmt;
use ariadne::Label;
use ariadne::Report;
use ariadne::ReportKind;
use crypto::digest::Digest;
use crypto::sha2::Sha512;
use mlua::Function;
use mlua::Lua;
use regex::Captures;
use regex::Regex;
use syntect::easy::HighlightLines;
use syntect::highlighting::ThemeSet;
use syntect::parsing::SyntaxSet;

use crate::{cache::cache::{Cached, CachedError}, compiler::compiler::{Compiler, Target}, document::{document::Document, element::{ElemKind, Element}}, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}, util::{self, Property, PropertyParser}}};
use crate::cache::cache::Cached;
use crate::cache::cache::CachedError;
use crate::compiler::compiler::Compiler;
use crate::compiler::compiler::Target;
use crate::document::document::Document;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::parser::parser::Parser;
use crate::parser::rule::RegexRule;
use crate::parser::source::Source;
use crate::parser::source::Token;
use crate::parser::util::Property;
use crate::parser::util::PropertyMapError;
use crate::parser::util::PropertyParser;
use crate::parser::util::{self};
use lazy_static::lazy_static;

#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum CodeKind
{
enum CodeKind {
    FullBlock,
    MiniBlock,
    Inline,
}

#[derive(Debug)]
struct Code
{
struct Code {
    location: Token,
    block: CodeKind,
    language: String,
@@ -30,57 +53,91 @@ struct Code
}

impl Code {
    fn new(location: Token, block: CodeKind, language: String, name: Option<String>, code: String, theme: Option<String>, line_offset: usize) -> Self {
        Self { location, block, language, name, code, theme, line_offset }
    fn new(
        location: Token,
        block: CodeKind,
        language: String,
        name: Option<String>,
        code: String,
        theme: Option<String>,
        line_offset: usize,
    ) -> Self {
        Self {
            location,
            block,
            language,
            name,
            code,
            theme,
            line_offset,
        }
    }

    fn highlight_html(&self, compiler: &Compiler) -> Result<String, String>
    {
    fn highlight_html(&self, compiler: &Compiler) -> Result<String, String> {
        lazy_static! {
            static ref syntax_set : SyntaxSet = SyntaxSet::load_defaults_newlines();
            static ref theme_set : ThemeSet = ThemeSet::load_defaults();
            static ref syntax_set: SyntaxSet = SyntaxSet::load_defaults_newlines();
            static ref theme_set: ThemeSet = ThemeSet::load_defaults();
        }
        let syntax = match syntax_set.find_syntax_by_name(self.language.as_str())
        {
        let syntax = match syntax_set.find_syntax_by_name(self.language.as_str()) {
            Some(syntax) => syntax,
            None => return Err(format!("Unable to find syntax for language: {}", self.language))
            None => {
                return Err(format!(
                    "Unable to find syntax for language: {}",
                    self.language
                ))
            }
        };

        let theme_string = match self.theme.as_ref()
        {
        let theme_string = match self.theme.as_ref() {
            Some(theme) => theme.as_str(),
            None => "base16-ocean.dark",
        };
        let mut h = HighlightLines::new(syntax, &theme_set.themes[theme_string]);

        let mut result = String::new();
        if self.block == CodeKind::FullBlock
        {
        if self.block == CodeKind::FullBlock {
            result += "<div class=\"code-block\">";
            if let Some(name) = &self.name
            {
                result += format!("<div class=\"code-block-title\">{}</div>",
                    compiler.sanitize(name.as_str())).as_str();
            if let Some(name) = &self.name {
                result += format!(
                    "<div class=\"code-block-title\">{}</div>",
                    compiler.sanitize(name.as_str())
                )
                .as_str();
            }

            result += format!("<div class=\"code-block-content\"><table cellspacing=\"0\">").as_str();
            for (line_id, line) in self.code.split(|c| c == '\n').enumerate()
            {
            result +=
                format!("<div class=\"code-block-content\"><table cellspacing=\"0\">").as_str();
            for (line_id, line) in self.code.split(|c| c == '\n').enumerate() {
                result += "<tr><td class=\"code-block-gutter\">";

                // Line number
                result += format!("<pre><span>{}</span></pre>", line_id+self.line_offset).as_str();
                result +=
                    format!("<pre><span>{}</span></pre>", line_id + self.line_offset).as_str();

                // Code
                result += "</td><td class=\"code-block-line\"><pre>";
                match h.highlight_line(line, &syntax_set)
                {
                    Err(e) => return Err(format!("Error highlighting line `{line}`: {}", e.to_string())),
                match h.highlight_line(line, &syntax_set) {
                    Err(e) => {
                        return Err(format!(
                            "Error highlighting line `{line}`: {}",
                            e.to_string()
                        ))
                    }
                    Ok(regions) => {
                        match syntect::html::styled_line_to_highlighted_html(&regions[..], syntect::html::IncludeBackground::No)
                        {
                            Err(e) => return Err(format!("Error highlighting code: {}", e.to_string())),
                            Ok(highlighted) => result += if highlighted.is_empty() { "<br>" } else { highlighted.as_str() }
                        match syntect::html::styled_line_to_highlighted_html(
                            &regions[..],
                            syntect::html::IncludeBackground::No,
                        ) {
                            Err(e) => {
                                return Err(format!("Error highlighting code: {}", e.to_string()))
                            }
                            Ok(highlighted) => {
                                result += if highlighted.is_empty() {
                                    "<br>"
                                } else {
                                    highlighted.as_str()
                                }
                            }
                        }
                    }
                }
@@ -88,41 +145,59 @@ impl Code {
            }

            result += "</table></div></div>";
        }
        else if self.block == CodeKind::MiniBlock
        {
        } else if self.block == CodeKind::MiniBlock {
            result += "<div class=\"code-block\"><div class=\"code-block-content\"><table cellspacing=\"0\">";

            for line in self.code.split(|c| c == '\n')
            {
            for line in self.code.split(|c| c == '\n') {
                result += "<tr><td class=\"code-block-line\"><pre>";
                // Code
                match h.highlight_line(line, &syntax_set)
                {
                    Err(e) => return Err(format!("Error highlighting line `{line}`: {}", e.to_string())),
                match h.highlight_line(line, &syntax_set) {
                    Err(e) => {
                        return Err(format!(
                            "Error highlighting line `{line}`: {}",
                            e.to_string()
                        ))
                    }
                    Ok(regions) => {
                        match syntect::html::styled_line_to_highlighted_html(&regions[..], syntect::html::IncludeBackground::No)
                        {
                            Err(e) => return Err(format!("Error highlighting code: {}", e.to_string())),
                            Ok(highlighted) => result += if highlighted.is_empty() { "<br>" } else { highlighted.as_str() }
                        match syntect::html::styled_line_to_highlighted_html(
                            &regions[..],
                            syntect::html::IncludeBackground::No,
                        ) {
                            Err(e) => {
                                return Err(format!("Error highlighting code: {}", e.to_string()))
                            }
                            Ok(highlighted) => {
                                result += if highlighted.is_empty() {
                                    "<br>"
                                } else {
                                    highlighted.as_str()
                                }
                            }
                        }
                    }
                }
                result += "</pre></td></tr>";
            }
            result += "</table></div></div>";
        }
        else if self.block == CodeKind::Inline
        {
        } else if self.block == CodeKind::Inline {
            result += "<a class=\"inline-code\"><code>";
            match h.highlight_line(self.code.as_str(), &syntax_set)
            {
                Err(e) => return Err(format!("Error highlighting line `{}`: {}", self.code, e.to_string())),
            match h.highlight_line(self.code.as_str(), &syntax_set) {
                Err(e) => {
                    return Err(format!(
                        "Error highlighting line `{}`: {}",
                        self.code,
                        e.to_string()
                    ))
                }
                Ok(regions) => {
                    match syntect::html::styled_line_to_highlighted_html(&regions[..], syntect::html::IncludeBackground::No)
                    {
                        Err(e) => return Err(format!("Error highlighting code: {}", e.to_string())),
                        Ok(highlighted) => result += highlighted.as_str()
                    match syntect::html::styled_line_to_highlighted_html(
                        &regions[..],
                        syntect::html::IncludeBackground::No,
                    ) {
                        Err(e) => {
                            return Err(format!("Error highlighting code: {}", e.to_string()))
                        }
                        Ok(highlighted) => result += highlighted.as_str(),
                    }
                }
            }
@@ -133,8 +208,7 @@ impl Code {
    }
}

impl Cached for Code
{
impl Cached for Code {
    type Key = String;
    type Value = String;
@@ -144,9 +218,7 @@ impl Cached for Code
            highlighted BLOB NOT NULL);"
    }

    fn sql_get_query() -> &'static str {
        "SELECT highlighted FROM cached_code WHERE digest = (?1)"
    }
    fn sql_get_query() -> &'static str { "SELECT highlighted FROM cached_code WHERE digest = (?1)" }

    fn sql_insert_query() -> &'static str {
        "INSERT INTO cached_code (digest, highlighted) VALUES (?1, ?2)"
@@ -156,7 +228,9 @@ impl Cached for Code
        let mut hasher = Sha512::new();
        hasher.input((self.block as usize).to_be_bytes().as_slice());
        hasher.input((self.line_offset as usize).to_be_bytes().as_slice());
        self.theme.as_ref().map(|theme| hasher.input(theme.as_bytes()));
        self.theme
            .as_ref()
            .map(|theme| hasher.input(theme.as_bytes()));
        self.name.as_ref().map(|name| hasher.input(name.as_bytes()));
        hasher.input(self.language.as_bytes());
        hasher.input(self.code.as_bytes());
@@ -168,44 +242,47 @@ impl Cached for Code
impl Element for Code {
    fn location(&self) -> &Token { &self.location }

    fn kind(&self) -> ElemKind { if self.block == CodeKind::Inline { ElemKind::Inline } else { ElemKind::Block } }
    fn kind(&self) -> ElemKind {
        if self.block == CodeKind::Inline {
            ElemKind::Inline
        } else {
            ElemKind::Block
        }
    }

    fn element_name(&self) -> &'static str { "Code Block" }

    fn to_string(&self) -> String { format!("{self:#?}") }

    fn compile(&self, compiler: &Compiler, _document: &Document)
        -> Result<String, String> {

        match compiler.target()
        {
    fn compile(&self, compiler: &Compiler, _document: &dyn Document) -> Result<String, String> {
        match compiler.target() {
            Target::HTML => {
                static CACHE_INIT : Once = Once::new();
                CACHE_INIT.call_once(|| if let Some(mut con) = compiler.cache() {
                    if let Err(e) = Code::init(&mut con)
                    {
                static CACHE_INIT: Once = Once::new();
                CACHE_INIT.call_once(|| {
                    if let Some(mut con) = compiler.cache() {
                        if let Err(e) = Code::init(&mut con) {
                            eprintln!("Unable to create cache table: {e}");
                        }
                    }
                });

                if let Some(mut con) = compiler.cache()
                {
                    match self.cached(&mut con, |s| s.highlight_html(compiler))
                    {
                if let Some(mut con) = compiler.cache() {
                    match self.cached(&mut con, |s| s.highlight_html(compiler)) {
                        Ok(s) => Ok(s),
                        Err(e) => match e
                        {
                            CachedError::SqlErr(e) => Err(format!("Querying the cache failed: {e}")),
                            CachedError::GenErr(e) => Err(e)
                        Err(e) => match e {
                            CachedError::SqlErr(e) => {
                                Err(format!("Querying the cache failed: {e}"))
                            }
                            CachedError::GenErr(e) => Err(e),
                        },
                    }
                }
                else
                {
                } else {
                    self.highlight_html(compiler)
                }
            }
            Target::LATEX => { todo!("") }
            Target::LATEX => {
                todo!("")
            }
        }
    }
}
@@ -218,33 +295,46 @@ pub struct CodeRule {
impl CodeRule {
    pub fn new() -> Self {
        let mut props = HashMap::new();
        props.insert("line_offset".to_string(),
        props.insert(
            "line_offset".to_string(),
            Property::new(
                true,
                "Line number offset".to_string(),
                Some("1".to_string())));
                Some("1".to_string()),
            ),
        );
        Self {
            re: [
                Regex::new(r"(?:^|\n)```(?:\[((?:\\.|[^\\\\])*?)\])?(.*?)(?:,(.*))?\n((?:\\(?:.|\n)|[^\\\\])*?)```").unwrap(),
                Regex::new(r"``(?:\[((?:\\.|[^\[\]\\])*?)\])?(?:(.*)(?:\n|,))?((?:\\(?:.|\n)|[^\\\\])*?)``").unwrap(),
                Regex::new(
                    r"(?:^|\n)```(?:\[((?:\\.|[^\\\\])*?)\])?(.*?)(?:,(.*))?\n((?:\\(?:.|\n)|[^\\\\])*?)```",
                )
                .unwrap(),
                Regex::new(
                    r"``(?:\[((?:\\.|[^\[\]\\])*?)\])?(?:(.*?)(?:\n|,))?((?:\\(?:.|\n)|[^\\\\])*?)``",
                )
                .unwrap(),
            ],
            properties: PropertyParser::new(props)
            properties: PropertyParser::new(props),
        }
    }
}

impl RegexRule for CodeRule
{
impl RegexRule for CodeRule {
    fn name(&self) -> &'static str { "Code" }

    fn regexes(&self) -> &[regex::Regex] { &self.re }

    fn on_regex_match(&self, index: usize, parser: &dyn Parser, document: &Document, token: Token, matches: Captures)
        -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
    fn on_regex_match<'a>(
        &self,
        index: usize,
        parser: &dyn Parser,
        document: &'a dyn Document,
        token: Token,
        matches: Captures,
    ) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
        let mut reports = vec![];

        let properties = match matches.get(1)
        {
        let properties = match matches.get(1) {
            None => match self.properties.default() {
                Ok(properties) => properties,
                Err(e) => {
@@ -254,16 +344,17 @@ impl RegexRule for CodeRule
                        .with_label(
                            Label::new((token.source().clone(), token.range.clone()))
                                .with_message(format!("Code is missing properties: {e}"))
                                .with_color(parser.colors().error))
                        .finish());
                                .with_color(parser.colors().error),
                        )
                        .finish(),
                    );
                    return reports;
                },
                }
            },
            Some(props) => {
                let processed = util::process_escaped('\\', "]",
                    props.as_str().trim_start().trim_end());
                match self.properties.parse(processed.as_str())
                {
                let processed =
                    util::process_escaped('\\', "]", props.as_str().trim_start().trim_end());
                match self.properties.parse(processed.as_str()) {
                    Err(e) => {
                        reports.push(
                            Report::build(ReportKind::Error, token.source(), props.start())
@@ -271,30 +362,32 @@ impl RegexRule for CodeRule
                            .with_label(
                                Label::new((token.source().clone(), props.range()))
                                    .with_message(e)
                                    .with_color(parser.colors().error))
                            .finish());
                                    .with_color(parser.colors().error),
                            )
                            .finish(),
                        );
                        return reports;
                    }
                    Ok(properties) => properties
                    Ok(properties) => properties,
                }
            }
        };

        let code_lang = match matches.get(2)
        {
        let code_lang = match matches.get(2) {
            None => "Plain Text".to_string(),
            Some(lang) => {
                let code_lang = lang.as_str().trim_end().trim_start().to_string();
                if code_lang.is_empty()
                {
                if code_lang.is_empty() {
                    reports.push(
                        Report::build(ReportKind::Error, token.source(), lang.start())
                            .with_message("Missing code language")
                            .with_label(
                                Label::new((token.source().clone(), lang.range()))
                                    .with_message("No language specified")
                                    .with_color(parser.colors().error))
                            .finish());
                                    .with_color(parser.colors().error),
                            )
                            .finish(),
                    );

                    return reports;
                }
@@ -305,43 +398,50 @@ impl RegexRule for CodeRule
            }
        };

        let mut code_content = if index == 0
        { util::process_escaped('\\',"```", matches.get(4).unwrap().as_str()) }
        else
        { util::process_escaped('\\',"``", matches.get(3).unwrap().as_str()) };
        if code_content.bytes().last() == Some('\n' as u8) // Remove newline
        let mut code_content = if index == 0 {
            util::process_escaped('\\', "```", matches.get(4).unwrap().as_str())
        } else {
            util::process_escaped('\\', "``", matches.get(3).unwrap().as_str())
        };
        if code_content.bytes().last() == Some('\n' as u8)
        // Remove newline
        {
            code_content.pop();
        }

        if code_content.is_empty()
        {
        if code_content.is_empty() {
            reports.push(
                Report::build(ReportKind::Error, token.source(), token.start())
                    .with_message("Missing code content")
                    .with_label(
                        Label::new((token.source().clone(), token.range.clone()))
                            .with_message("Code content cannot be empty")
                            .with_color(parser.colors().error))
                    .finish());
                            .with_color(parser.colors().error),
                    )
                    .finish(),
            );
            return reports;
        }

        let theme = document.get_variable("code.theme")
            .and_then(|(_doc, var)| Some(var.to_string()));
        let theme = document
            .get_variable("code.theme")
            .and_then(|var| Some(var.to_string()));

        if index == 0 // Block
        if index == 0
        // Block
        {
            let code_name = matches.get(3)
                .and_then(|name| {
            let code_name = matches.get(3).and_then(|name| {
                let code_name = name.as_str().trim_end().trim_start().to_string();
                (!code_name.is_empty()).then_some(code_name)
            });
            let line_offset = match properties.get("line_offset",
                |prop, value| value.parse::<usize>().map_err(|e| (prop, e)))
            {
            let line_offset =
                match properties.get("line_offset", |prop, value| {
                    value.parse::<usize>().map_err(|e| (prop, e))
                }) {
                    Ok((_prop, offset)) => offset,
                    Err((prop, e)) => {
                    Err(e) => {
                        match e {
                            PropertyMapError::ParseError((prop, err)) => {
                                reports.push(
                                    Report::build(ReportKind::Error, token.source(), token.start())
                                        .with_message("Invalid Code Property")
@@ -349,32 +449,58 @@ impl RegexRule for CodeRule
                                        Label::new((token.source().clone(), token.start()+1..token.end()))
                                            .with_message(format!("Property `line_offset: {}` cannot be converted: {}",
                                                prop.fg(parser.colors().info),
                                                e.fg(parser.colors().error)))
                                                err.fg(parser.colors().error)))
                                            .with_color(parser.colors().warning))
                                    .finish());
                                return reports;
                            }
                            PropertyMapError::NotFoundError(err) => {
                                reports.push(
                                    Report::build(ReportKind::Error, token.source(), token.start())
                                        .with_message("Invalid Code Property")
                                        .with_label(
                                            Label::new((
                                                token.source().clone(),
                                                token.start() + 1..token.end(),
                                            ))
                                            .with_message(format!(
                                                "Property `{}` doesn't exist",
                                                err.fg(parser.colors().info)
                                            ))
                                            .with_color(parser.colors().warning),
                                        )
                                        .finish(),
                                );
                                return reports;
                            }
                        }
                    }
                };

            parser.push(document, Box::new(
                Code::new(
            parser.push(
                document,
                Box::new(Code::new(
                    token.clone(),
                    CodeKind::FullBlock,
                    code_lang,
                    code_name,
                    code_content,
                    theme,
                    line_offset
                )
            ));
        }
        else // Maybe inline
                    line_offset,
                )),
            );
        } else
        // Maybe inline
        {
            let block = if code_content.contains('\n') { CodeKind::MiniBlock }
            else { CodeKind::Inline };
            let block = if code_content.contains('\n') {
                CodeKind::MiniBlock
            } else {
                CodeKind::Inline
            };

            parser.push(document, Box::new(
                Code::new(
            parser.push(
                document,
                Box::new(Code::new(
                    token.clone(),
                    block,
                    code_lang,
|
|||
code_content,
|
||||
theme,
|
||||
1,
|
||||
)
|
||||
));
|
||||
)),
|
||||
);
|
||||
}
|
||||
|
||||
reports
|
||||
|
|
|
@@ -1,8 +1,8 @@
use mlua::{Function, Lua};
use regex::{Captures, Regex};
use crate::parser::{parser::Parser, rule::RegexRule, source::{Source, Token}};
use crate::{document::document::Document, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}}};
use ariadne::{Report, Label, ReportKind};
use crate::{compiler::compiler::Compiler, document::{document::Document, element::{ElemKind, Element}}};
use crate::{compiler::compiler::Compiler, document::element::{ElemKind, Element}};
use std::{ops::Range, rc::Rc};

#[derive(Debug)]
@@ -24,7 +24,7 @@ impl Element for Comment
    fn kind(&self) -> ElemKind { ElemKind::Invisible }
    fn element_name(&self) -> &'static str { "Comment" }
    fn to_string(&self) -> String { format!("{self:#?}") }
    fn compile(&self, _compiler: &Compiler, _document: &Document)
    fn compile(&self, _compiler: &Compiler, _document: &dyn Document)
        -> Result<String, String> {
        Ok("".to_string())
    }
@@ -45,7 +45,7 @@ impl RegexRule for CommentRule {
    fn regexes(&self) -> &[Regex] { &self.re }

    fn on_regex_match(&self, _: usize, parser: &dyn Parser, document: &Document, token: Token, matches: Captures)
    fn on_regex_match<'a>(&self, _: usize, parser: &dyn Parser, document: &'a dyn Document, token: Token, matches: Captures)
        -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
        let mut reports = vec![];
375
src/elements/graphviz.rs
Normal file
@@ -0,0 +1,375 @@
use std::collections::HashMap;
use std::ops::Range;
use std::rc::Rc;
use std::sync::Once;

use crate::parser::util::Property;
use crate::parser::util::PropertyMapError;
use crate::parser::util::PropertyParser;
use ariadne::Fmt;
use ariadne::Label;
use ariadne::Report;
use ariadne::ReportKind;
use crypto::digest::Digest;
use crypto::sha2::Sha512;
use graphviz_rust::cmd::Format;
use graphviz_rust::cmd::Layout;
use graphviz_rust::exec_dot;
use mlua::Function;
use mlua::Lua;
use regex::Captures;
use regex::Regex;

use crate::cache::cache::Cached;
use crate::cache::cache::CachedError;
use crate::compiler::compiler::Compiler;
use crate::compiler::compiler::Target;
use crate::document::document::Document;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::parser::parser::Parser;
use crate::parser::rule::RegexRule;
use crate::parser::source::Source;
use crate::parser::source::Token;
use crate::parser::util;

#[derive(Debug)]
struct Graphviz {
    pub location: Token,
    pub dot: String,
    pub layout: Layout,
    pub width: String,
    pub caption: Option<String>,
}

fn layout_from_str(value: &str) -> Result<Layout, String> {
    match value {
        "dot" => Ok(Layout::Dot),
        "neato" => Ok(Layout::Neato),
        "fdp" => Ok(Layout::Fdp),
        "sfdp" => Ok(Layout::Sfdp),
        "circo" => Ok(Layout::Circo),
        "twopi" => Ok(Layout::Twopi),
        "osage" => Ok(Layout::Asage), // typo in graphviz_rust ?
        "patchwork" => Ok(Layout::Patchwork),
        _ => Err(format!("Unknown layout: {value}")),
    }
}
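A stand-alone sketch of the mapping above, with a local `Layout` enum standing in for `graphviz_rust::cmd::Layout` so it runs without the crate:

```rust
// Local stand-in for graphviz_rust::cmd::Layout; `Asage` mirrors the
// upstream enum's spelling of the `osage` engine.
#[derive(Debug, PartialEq)]
enum Layout { Dot, Neato, Fdp, Sfdp, Circo, Twopi, Asage, Patchwork }

// Same shape as the parser's layout_from_str: every Graphviz layout
// engine name maps to a variant, anything else is an error string.
fn layout_from_str(value: &str) -> Result<Layout, String> {
    match value {
        "dot" => Ok(Layout::Dot),
        "neato" => Ok(Layout::Neato),
        "fdp" => Ok(Layout::Fdp),
        "sfdp" => Ok(Layout::Sfdp),
        "circo" => Ok(Layout::Circo),
        "twopi" => Ok(Layout::Twopi),
        "osage" => Ok(Layout::Asage),
        "patchwork" => Ok(Layout::Patchwork),
        _ => Err(format!("Unknown layout: {value}")),
    }
}

fn main() {
    assert_eq!(layout_from_str("dot"), Ok(Layout::Dot));
    assert!(layout_from_str("bogus").is_err());
}
```

This is where the rule's `layout` property (default `"dot"`) ends up being decoded, so an unknown engine name surfaces as a property error rather than a failed `dot` invocation.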
impl Graphviz {
    /// Renders dot to svg
    fn dot_to_svg(&self) -> Result<String, String> {
        print!("Rendering Graphviz `{}`... ", self.dot);

        let svg = match exec_dot(
            self.dot.clone(),
            vec![self.layout.into(), Format::Svg.into()],
        ) {
            Ok(svg) => {
                let out = String::from_utf8_lossy(svg.as_slice());
                let svg_start = out.find("<svg").unwrap(); // Remove svg header
                let split_at = out.split_at(svg_start).1.find('\n').unwrap();

                let mut result = format!("<svg width=\"{}\"", self.width);
                result.push_str(out.split_at(svg_start+split_at).1);

                result
            }
            Err(e) => return Err(format!("Unable to execute dot: {e}")),
        };
        println!("Done!");

        Ok(svg)
    }
}

impl Cached for Graphviz {
    type Key = String;
    type Value = String;

    fn sql_table() -> &'static str {
        "CREATE TABLE IF NOT EXISTS cached_dot (
            digest TEXT PRIMARY KEY,
            svg BLOB NOT NULL);"
    }

    fn sql_get_query() -> &'static str { "SELECT svg FROM cached_dot WHERE digest = (?1)" }

    fn sql_insert_query() -> &'static str { "INSERT INTO cached_dot (digest, svg) VALUES (?1, ?2)" }

    fn key(&self) -> <Self as Cached>::Key {
        let mut hasher = Sha512::new();
        hasher.input((self.layout as usize).to_be_bytes().as_slice());
        hasher.input(self.dot.as_bytes());

        hasher.result_str()
    }
}

impl Element for Graphviz {
    fn location(&self) -> &Token { &self.location }

    fn kind(&self) -> ElemKind { ElemKind::Block }

    fn element_name(&self) -> &'static str { "Graphviz" }

    fn to_string(&self) -> String { format!("{self:#?}") }

    fn compile(&self, compiler: &Compiler, _document: &dyn Document) -> Result<String, String> {
        match compiler.target() {
            Target::HTML => {
                static CACHE_INIT: Once = Once::new();
                CACHE_INIT.call_once(|| {
                    if let Some(mut con) = compiler.cache() {
                        if let Err(e) = Graphviz::init(&mut con) {
                            eprintln!("Unable to create cache table: {e}");
                        }
                    }
                });
                // TODO: Format svg in a div

                if let Some(mut con) = compiler.cache() {
                    match self.cached(&mut con, |s| s.dot_to_svg()) {
                        Ok(s) => Ok(s),
                        Err(e) => match e {
                            CachedError::SqlErr(e) => {
                                Err(format!("Querying the cache failed: {e}"))
                            }
                            CachedError::GenErr(e) => Err(e),
                        },
                    }
                } else {
                    match self.dot_to_svg() {
                        Ok(svg) => Ok(svg),
                        Err(e) => Err(e),
                    }
                }
            }
            _ => todo!("Unimplemented"),
        }
    }
}

pub struct GraphRule {
    re: [Regex; 1],
    properties: PropertyParser,
}

impl GraphRule {
    pub fn new() -> Self {
        let mut props = HashMap::new();
        props.insert(
            "layout".to_string(),
            Property::new(
                true,
                "Graphviz layout engine see <https://graphviz.org/docs/layouts/>".to_string(),
                Some("dot".to_string()),
            ),
        );
        props.insert(
            "width".to_string(),
            Property::new(
                true,
                "SVG width".to_string(),
                Some("100%".to_string()),
            ),
        );
        Self {
            re: [Regex::new(
                r"\[graph\](?:\[((?:\\.|[^\[\]\\])*?)\])?(?:((?:\\.|[^\\\\])*?)\[/graph\])?",
            )
            .unwrap()],
            properties: PropertyParser::new(props),
        }
    }
}

impl RegexRule for GraphRule {
    fn name(&self) -> &'static str { "Graph" }

    fn regexes(&self) -> &[regex::Regex] { &self.re }

    fn on_regex_match(
        &self,
        _: usize,
        parser: &dyn Parser,
        document: &dyn Document,
        token: Token,
        matches: Captures,
    ) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
        let mut reports = vec![];

        let graph_content = match matches.get(2) {
            // Unterminated `[graph]`
            None => {
                reports.push(
                    Report::build(ReportKind::Error, token.source(), token.start())
                        .with_message("Unterminated Graph Code")
                        .with_label(
                            Label::new((token.source().clone(), token.range.clone()))
                                .with_message(format!(
                                    "Missing terminating `{}` after first `{}`",
                                    "[/graph]".fg(parser.colors().info),
                                    "[graph]".fg(parser.colors().info)
                                ))
                                .with_color(parser.colors().error),
                        )
                        .finish(),
                );
                return reports;
            }
            Some(content) => {
                let processed = util::process_escaped(
                    '\\',
                    "[/graph]",
                    content.as_str().trim_start().trim_end(),
                );

                if processed.is_empty() {
                    reports.push(
                        Report::build(ReportKind::Error, token.source(), content.start())
                            .with_message("Empty Graph Code")
                            .with_label(
                                Label::new((token.source().clone(), content.range()))
                                    .with_message("Graph code is empty")
                                    .with_color(parser.colors().error),
                            )
                            .finish(),
                    );
                    return reports;
                }
                processed
            }
        };

        // Properties
        let properties = match matches.get(1) {
            None => match self.properties.default() {
                Ok(properties) => properties,
                Err(e) => {
                    reports.push(
                        Report::build(ReportKind::Error, token.source(), token.start())
                            .with_message("Invalid Graph")
                            .with_label(
                                Label::new((token.source().clone(), token.range.clone()))
                                    .with_message(format!("Graph is missing property: {e}"))
                                    .with_color(parser.colors().error),
                            )
                            .finish(),
                    );
                    return reports;
                }
            },
            Some(props) => {
                let processed =
                    util::process_escaped('\\', "]", props.as_str().trim_start().trim_end());
                match self.properties.parse(processed.as_str()) {
                    Err(e) => {
                        reports.push(
                            Report::build(ReportKind::Error, token.source(), props.start())
|
||||
.with_message("Invalid Graph Properties")
|
||||
.with_label(
|
||||
Label::new((token.source().clone(), props.range()))
|
||||
.with_message(e)
|
||||
.with_color(parser.colors().error),
|
||||
)
|
||||
.finish(),
|
||||
);
|
||||
return reports;
|
||||
}
|
||||
Ok(properties) => properties,
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
// Property "layout"
|
||||
let graph_layout = match properties.get("layout", |prop, value| {
|
||||
layout_from_str(value.as_str()).map_err(|e| (prop, e))
|
||||
}) {
|
||||
Ok((_prop, kind)) => kind,
|
||||
Err(e) => match e {
|
||||
PropertyMapError::ParseError((prop, err)) => {
|
||||
reports.push(
|
||||
Report::build(ReportKind::Error, token.source(), token.start())
|
||||
.with_message("Invalid Graph Property")
|
||||
.with_label(
|
||||
Label::new((token.source().clone(), token.range.clone()))
|
||||
.with_message(format!(
|
||||
"Property `layout: {}` cannot be converted: {}",
|
||||
prop.fg(parser.colors().info),
|
||||
err.fg(parser.colors().error)
|
||||
))
|
||||
.with_color(parser.colors().warning),
|
||||
)
|
||||
.finish(),
|
||||
);
|
||||
return reports;
|
||||
}
|
||||
PropertyMapError::NotFoundError(err) => {
|
||||
reports.push(
|
||||
Report::build(ReportKind::Error, token.source(), token.start())
|
||||
.with_message("Invalid Graph Property")
|
||||
.with_label(
|
||||
Label::new((
|
||||
token.source().clone(),
|
||||
token.start() + 1..token.end(),
|
||||
))
|
||||
.with_message(err)
|
||||
.with_color(parser.colors().warning),
|
||||
)
|
||||
.finish(),
|
||||
);
|
||||
return reports;
|
||||
}
|
||||
},
|
||||
};
|
||||
|
||||
// FIXME: You can escape html, make sure we escape single "
|
||||
// Property "width"
|
||||
let graph_width = match properties.get("width", |_, value| -> Result<String, ()> {
|
||||
Ok(value.clone())
|
||||
}) {
|
||||
Ok((_, kind)) => kind,
|
||||
Err(e) => match e {
|
||||
PropertyMapError::NotFoundError(err) => {
|
||||
reports.push(
|
||||
Report::build(ReportKind::Error, token.source(), token.start())
|
||||
.with_message("Invalid Graph Property")
|
||||
.with_label(
|
||||
Label::new((
|
||||
token.source().clone(),
|
||||
token.start() + 1..token.end(),
|
||||
))
|
||||
.with_message(format!(
|
||||
"Property `{}` is missing",
|
||||
err.fg(parser.colors().info)
|
||||
))
|
||||
.with_color(parser.colors().warning),
|
||||
)
|
||||
.finish(),
|
||||
);
|
||||
return reports;
|
||||
}
|
||||
_ => panic!("Unknown error")
|
||||
},
|
||||
};
|
||||
|
||||
// TODO: Caption
|
||||
|
||||
parser.push(
|
||||
document,
|
||||
Box::new(Graphviz {
|
||||
location: token,
|
||||
dot: graph_content,
|
||||
layout: graph_layout,
|
||||
width: graph_width,
|
||||
caption: None,
|
||||
}),
|
||||
);
|
||||
|
||||
reports
|
||||
}
|
||||
|
||||
// TODO
|
||||
fn lua_bindings<'lua>(&self, _lua: &'lua Lua) -> Vec<(String, Function<'lua>)> { vec![] }
|
||||
}
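The rule above captures `[graph][props]...[/graph]` blocks with a regex and unescapes the body before handing it to Graphviz. As an illustrative, simplified sketch of the delimiter handling only (ignoring the escape processing and the optional property list, which the real `GraphRule` handles), the same block can be located with plain `str::find`:

```rust
/// Simplified extractor for `[graph]...[/graph]` blocks.
/// Unlike the real GraphRule, this ignores `\`-escapes and the
/// optional `[layout=...]` property group; illustration only.
fn extract_graph_block(input: &str) -> Option<&str> {
    // Find the opening tag and skip past it.
    let start = input.find("[graph]")? + "[graph]".len();
    // Find the closing tag after the opening one.
    let end = input[start..].find("[/graph]")? + start;
    Some(input[start..end].trim())
}

fn main() {
    let doc = "Some text [graph]digraph { a -> b }[/graph] more text";
    assert_eq!(extract_graph_block(doc), Some("digraph { a -> b }"));
    assert_eq!(extract_graph_block("no graph here"), None);
    println!("ok");
}
```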

@@ -1,8 +1,7 @@
use mlua::{Function, Lua};
use regex::Regex;
use crate::parser::{parser::{Parser, ReportColors}, rule::RegexRule, source::{Source, SourceFile, Token}};
use regex::{Captures, Regex};
use crate::{document::document::{DocumentAccessors, Document}, parser::{parser::{Parser, ReportColors}, rule::RegexRule, source::{Source, SourceFile, Token}}};
use ariadne::{Report, Fmt, Label, ReportKind};
use crate::document::document::Document;
use std::{ops::Range, rc::Rc};

use super::paragraph::Paragraph;

@@ -35,8 +34,8 @@ impl RegexRule for ImportRule {

	fn regexes(&self) -> &[Regex] { &self.re }

	fn on_regex_match(&self, _: usize, parser: &dyn Parser, document: &Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>
	{
	fn on_regex_match<'a>(&self, _: usize, parser: &dyn Parser, document: &'a dyn Document<'a>, token: Token, matches: Captures)
		-> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
		let mut result = vec![];

		// Path

@@ -138,13 +137,11 @@ impl RegexRule for ImportRule {
			}
		};

		// TODO
		let import_doc = parser.parse(import, Some(&document));

		document.merge(import_doc, Some(&import_as));
		let import_doc = parser.parse(import, Some(document));
		document.merge(import_doc.content(), import_doc.scope(), Some(&import_as));

		// Close paragraph
		if document.last_element::<Paragraph>(false).is_some()
		if document.last_element::<Paragraph>().is_some()
		{
			parser.push(document, Box::new(Paragraph::new(
				Token::new(token.end()..token.end(), token.source())
@@ -1,10 +1,23 @@
use mlua::{Function, Lua};
use crate::compiler::compiler::Compiler;
use crate::compiler::compiler::Target;
use crate::document::document::Document;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::parser::parser::Parser;
use crate::parser::rule::RegexRule;
use crate::parser::source::Source;
use crate::parser::source::Token;
use crate::parser::util;
use ariadne::Fmt;
use ariadne::Label;
use ariadne::Report;
use ariadne::ReportKind;
use mlua::Function;
use mlua::Lua;
use regex::Captures;
use regex::Regex;
use serde::{Deserialize, Serialize};
use crate::parser::{parser::Parser, rule::RegexRule, source::{Source, Token}, util};
use ariadne::{Report, Fmt, Label, ReportKind};
use crate::{compiler::compiler::{Compiler, Target}, document::{document::Document, element::{ElemKind, Element}}};
use std::{ops::Range, rc::Rc};
use std::ops::Range;
use std::rc::Rc;

#[derive(Debug)]
pub struct Link {

@@ -13,34 +26,33 @@ pub struct Link {
	url: String, // Link url
}

impl Link
{
impl Link {
	pub fn new(location: Token, name: String, url: String) -> Self {
		Self { location: location, name, url }
		Self {
			location: location,
			name,
			url,
		}
	}
}

impl Element for Link
{
impl Element for Link {
	fn location(&self) -> &Token { &self.location }
	fn kind(&self) -> ElemKind { ElemKind::Inline }
	fn element_name(&self) -> &'static str { "Link" }
	fn to_string(&self) -> String { format!("{self:#?}") }
	fn compile(&self, compiler: &Compiler, _document: &Document) -> Result<String, String> {
		match compiler.target()
		{
			Target::HTML => {
				Ok(format!("<a href=\"{}\">{}</a>",
	fn compile(&self, compiler: &Compiler, _document: &dyn Document) -> Result<String, String> {
		match compiler.target() {
			Target::HTML => Ok(format!(
				"<a href=\"{}\">{}</a>",
				compiler.sanitize(self.url.as_str()),
				compiler.sanitize(self.name.as_str()),
			))
			},
			Target::LATEX => {
				Ok(format!("\\href{{{}}}{{{}}}",
			)),
			Target::LATEX => Ok(format!(
				"\\href{{{}}}{{{}}}",
				compiler.sanitize(self.url.as_str()),
				compiler.sanitize(self.name.as_str()),
			))
			},
			)),
		}
	}
}

@@ -51,7 +63,9 @@ pub struct LinkRule {

impl LinkRule {
	pub fn new() -> Self {
		Self { re: [Regex::new(r"(?:^|\n)```(.*?)(?:,(.*))?\n((?:\\.|[^\[\]\\])*?)```").unwrap()] }
		Self {
			re: [Regex::new(r"\[((?:\\.|[^\\\\])*?)\]\(((?:\\.|[^\\\\])*?)\)").unwrap()],
		}
	}
}

@@ -60,91 +74,98 @@ impl RegexRule for LinkRule {

	fn regexes(&self) -> &[Regex] { &self.re }

	fn on_regex_match(&self, _: usize, parser: &dyn Parser, document: &Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>
	{
	fn on_regex_match<'a>(
		&self,
		_: usize,
		parser: &dyn Parser,
		document: &'a dyn Document,
		token: Token,
		matches: Captures,
	) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
		let mut result = vec![];
		let link_name = match matches.get(1)
		{
		let link_name = match matches.get(1) {
			Some(name) => {
				if name.as_str().is_empty()
				{
				if name.as_str().is_empty() {
					result.push(
						Report::build(ReportKind::Error, token.source(), name.start())
							.with_message("Empty link name")
							.with_label(
								Label::new((token.source().clone(), name.range()))
									.with_message("Link name is empty")
									.with_color(parser.colors().error))
						.finish());
									.with_color(parser.colors().error),
							)
							.finish(),
					);
					return result;
				}
				// TODO: process into separate document...
				let text_content = util::process_text(document, name.as_str());

				if text_content.as_str().is_empty()
				{
				if text_content.as_str().is_empty() {
					result.push(
						Report::build(ReportKind::Error, token.source(), name.start())
							.with_message("Empty link name")
							.with_label(
								Label::new((token.source(), name.range()))
									.with_message(format!("Link name is empty. Once processed, `{}` yields `{}`",
									.with_message(format!(
										"Link name is empty. Once processed, `{}` yields `{}`",
										name.as_str().fg(parser.colors().highlight),
										text_content.as_str().fg(parser.colors().highlight),
									))
									.with_color(parser.colors().error))
						.finish());
									.with_color(parser.colors().error),
							)
							.finish(),
					);
					return result;
				}
				text_content
			},
			}
			_ => panic!("Empty link name"),
		};

		let link_url = match matches.get(2)
		{
		let link_url = match matches.get(2) {
			Some(url) => {
				if url.as_str().is_empty()
				{
				if url.as_str().is_empty() {
					result.push(
						Report::build(ReportKind::Error, token.source(), url.start())
							.with_message("Empty link url")
							.with_label(
								Label::new((token.source(), url.range()))
									.with_message("Link url is empty")
									.with_color(parser.colors().error))
						.finish());
									.with_color(parser.colors().error),
							)
							.finish(),
					);
					return result;
				}
				let text_content = util::process_text(document, url.as_str());

				if text_content.as_str().is_empty()
				{
				if text_content.as_str().is_empty() {
					result.push(
						Report::build(ReportKind::Error, token.source(), url.start())
							.with_message("Empty link url")
							.with_label(
								Label::new((token.source(), url.range()))
									.with_message(format!("Link url is empty. Once processed, `{}` yields `{}`",
									.with_message(format!(
										"Link url is empty. Once processed, `{}` yields `{}`",
										url.as_str().fg(parser.colors().highlight),
										text_content.as_str().fg(parser.colors().highlight),
									))
									.with_color(parser.colors().error))
						.finish());
									.with_color(parser.colors().error),
							)
							.finish(),
					);
					return result;
				}
				text_content
			},
			}
			_ => panic!("Empty link url"),
		};

		parser.push(document, Box::new(
			Link::new(
				token.clone(),
				link_name,
				link_url
			)
		));
		parser.push(
			document,
			Box::new(Link::new(token.clone(), link_name, link_url)),
		);

		return result;
	}
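The new `LinkRule` regex targets the familiar `[name](url)` syntax, with `(?:\\.|[^\\\\])*?` allowing escaped characters inside either part. As a hedged, escape-free sketch of what that pattern extracts (the real rule also runs both captures through `util::process_text`), the two captures can be pulled out with plain string searches:

```rust
/// Illustrative parser for the `[name](url)` syntax matched by
/// LinkRule; `\`-escape handling is deliberately omitted here.
fn parse_link(input: &str) -> Option<(&str, &str)> {
    // `[name](url)` => strip the leading `[`, split on `](`, strip the trailing `)`.
    let rest = input.strip_prefix('[')?;
    let close = rest.find("](")?;
    let name = &rest[..close];
    let url = rest[close + 2..].strip_suffix(')')?;
    Some((name, url))
}

fn main() {
    assert_eq!(
        parse_link("[example](https://example.com)"),
        Some(("example", "https://example.com"))
    );
    assert_eq!(parse_link("not a link"), None);
    println!("ok");
}
```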

@@ -1,6 +1,6 @@
use std::{any::Any, cell::Ref, ops::Range, rc::Rc};

use crate::{compiler::compiler::{Compiler, Target}, document::{document::Document, element::{ElemKind, Element}}, parser::{parser::Parser, rule::Rule, source::{Cursor, Source, Token, VirtualSource}}};
use crate::{compiler::compiler::{Compiler, Target}, document::{document::{Document, DocumentAccessors}, element::{ElemKind, Element}}, parser::{parser::Parser, rule::Rule, source::{Cursor, Source, Token, VirtualSource}}};
use ariadne::{Label, Report, ReportKind};
use mlua::{Function, Lua};
use regex::Regex;

@@ -57,7 +57,7 @@ impl Element for List

	fn to_string(&self) -> String { format!("{self:#?}") }

	fn compile(&self, compiler: &Compiler, document: &Document) -> Result<String, String> {
	fn compile(&self, compiler: &Compiler, document: &dyn Document) -> Result<String, String> {
		match compiler.target()
		{
			Target::HTML => {

@@ -95,7 +95,7 @@ impl Element for List
				match_stack(&mut result, &ent.numbering);
				result.push_str("<li>");
				match ent.content.iter().enumerate()
					.try_for_each(|(idx, elem)| {
					.try_for_each(|(_idx, elem)| {
						match elem.compile(compiler, document) {
							Err(e) => Err(e),
							Ok(s) => { result.push_str(s.as_str()); Ok(()) }

@@ -196,10 +196,11 @@ impl ListRule
	}

	fn parse_depth(depth: &str, document: &Document) -> Vec<(bool, usize)>
	fn parse_depth(depth: &str, document: &dyn Document) -> Vec<(bool, usize)>
	{
		let mut parsed = vec![];
		let prev_entry = document.last_element::<List>(true)
		// FIXME: Previous iteration used to recursively retrieve the list indent
		let prev_entry = document.last_element::<List>()
			.and_then(|list| Ref::filter_map(list, |m| m.entries.last() ).ok() )
			.and_then(|entry| Ref::filter_map(entry, |e| Some(&e.numbering)).ok() );


@@ -246,7 +247,8 @@ impl Rule for ListRule
			|m| Some((m.start(), Box::new([false;0]) as Box<dyn Any>)) )
	}

	fn on_match<'a>(&self, parser: &dyn Parser, document: &'a Document<'a>, cursor: Cursor, _match_data: Option<Box<dyn Any>>) -> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>) {
	fn on_match<'a>(&self, parser: &dyn Parser, document: &'a dyn Document<'a>, cursor: Cursor, _match_data: Option<Box<dyn Any>>)
		-> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>) {
		let mut reports = vec![];
		let content = cursor.source.content();
		let (end_cursor, numbering, source) = match self.start_re.captures_at(content, cursor.pos) {

@@ -310,8 +312,8 @@ impl Rule for ListRule
			},
		};

		let parsed_entry = parser.parse(Rc::new(source), Some(&document));
		let mut parsed_paragraph = parsed_entry.last_element_mut::<Paragraph>(false).unwrap(); // Extract content from paragraph
		let parsed_entry = parser.parse(Rc::new(source), Some(document));
		let mut parsed_paragraph = parsed_entry.last_element_mut::<Paragraph>().unwrap(); // Extract content from paragraph
		let entry = ListEntry::new(
			Token::new(cursor.pos..end_cursor.pos, cursor.source.clone()),
			numbering,

@@ -319,14 +321,14 @@ impl Rule for ListRule
		);

		// Get previous list, if none insert a new list
		let mut list = match document.last_element_mut::<List>(false)
		let mut list = match document.last_element_mut::<List>()
		{
			Some(last) => last,
			None => {
				parser.push(document,
					Box::new(List::new(
						Token::new(cursor.pos..end_cursor.pos, cursor.source.clone()))));
				document.last_element_mut::<List>(false).unwrap()
				document.last_element_mut::<List>().unwrap()
			}
		};
		list.push(entry);
462
src/elements/media.rs
Normal file

@@ -0,0 +1,462 @@
use std::collections::HashMap;
use std::ops::Range;
use std::rc::Rc;
use std::str::FromStr;

use ariadne::Fmt;
use ariadne::Label;
use ariadne::Report;
use ariadne::ReportKind;
use regex::Captures;
use regex::Match;
use regex::Regex;
use regex::RegexBuilder;

use crate::compiler::compiler::Compiler;
use crate::compiler::compiler::Target;
use crate::document::document::Document;
use crate::document::document::DocumentAccessors;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::document::references::validate_refname;
use crate::parser::parser::ReportColors;
use crate::parser::rule::RegexRule;
use crate::parser::source::Source;
use crate::parser::source::Token;
use crate::parser::source::VirtualSource;
use crate::parser::util;
use crate::parser::util::parse_paragraph;
use crate::parser::util::Property;
use crate::parser::util::PropertyMap;
use crate::parser::util::PropertyMapError;
use crate::parser::util::PropertyParser;

use super::paragraph::Paragraph;

#[derive(Debug)]
pub enum MediaType {
	IMAGE,
	VIDEO,
	AUDIO,
}

impl FromStr for MediaType {
	type Err = String;

	fn from_str(s: &str) -> Result<Self, Self::Err> {
		match s {
			"image" => Ok(MediaType::IMAGE),
			"video" => Ok(MediaType::VIDEO),
			"audio" => Ok(MediaType::AUDIO),
			_ => Err(format!("Unknown media type: {s}")),
		}
	}
}
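With `FromStr` implemented, a `type` property value can be converted via the standard `str::parse` machinery, which is how the rule later consumes it. A standalone copy of the mapping above (duplicated here only so the sketch compiles on its own; the real enum lives in `src/elements/media.rs`):

```rust
use std::str::FromStr;

// Standalone copy of the MediaType mapping for illustration.
#[derive(Debug, PartialEq)]
enum MediaType { IMAGE, VIDEO, AUDIO }

impl FromStr for MediaType {
    type Err = String;
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "image" => Ok(MediaType::IMAGE),
            "video" => Ok(MediaType::VIDEO),
            "audio" => Ok(MediaType::AUDIO),
            _ => Err(format!("Unknown media type: {s}")),
        }
    }
}

fn main() {
    // `parse` dispatches to the FromStr impl above.
    assert_eq!("image".parse::<MediaType>(), Ok(MediaType::IMAGE));
    assert!("webm".parse::<MediaType>().is_err());
    println!("ok");
}
```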

#[derive(Debug)]
struct MediaGroup {
	pub(self) location: Token,
	pub(self) media: Vec<Media>,
}

impl MediaGroup {
	fn push(&mut self, media: Media) -> Result<(), String> {
		if self.location.source() != media.location.source() {
			return Err(format!(
				"Attempted to insert media from {} into MediaGroup from {}",
				self.location.source(),
				media.location.source()
			));
		}

		self.location.range = self.location.start()..media.location.end();
		self.media.push(media);
		Ok(())
	}
}

impl Element for MediaGroup {
	fn location(&self) -> &Token { &self.location }

	fn kind(&self) -> ElemKind { ElemKind::Block }

	fn element_name(&self) -> &'static str { "Media Group" }

	fn to_string(&self) -> String { format!("{self:#?}") }

	fn compile(&self, compiler: &Compiler, document: &dyn Document) -> Result<String, String> {
		match compiler.target() {
			Target::HTML => {
				let mut result = String::new();

				result.push_str("<div class=\"media\">");
				for medium in &self.media {
					match medium.compile(compiler, document) {
						Ok(r) => result.push_str(r.as_str()),
						Err(e) => return Err(e),
					}
				}
				result.push_str("</div>");

				Ok(result)
			}
			_ => todo!(""),
		}
	}
}

#[derive(Debug)]
struct Media {
	pub(self) location: Token,
	pub(self) reference: String,
	pub(self) uri: String,
	pub(self) media_type: MediaType,
	pub(self) width: Option<String>,
	pub(self) caption: Option<String>,
	pub(self) description: Option<Paragraph>,
}

impl Element for Media {
	fn location(&self) -> &Token { &self.location }

	fn kind(&self) -> ElemKind { ElemKind::Block }

	fn element_name(&self) -> &'static str { "Media" }

	fn to_string(&self) -> String { format!("{self:#?}") }

	fn compile(&self, compiler: &Compiler, document: &dyn Document) -> Result<String, String> {
		match compiler.target() {
			Target::HTML => {
				let mut result = String::new();

				let width = self
					.width
					.as_ref()
					.map_or(String::new(), |w| format!(r#" style="width:{w};""#));
				result.push_str(format!(r#"<div class="medium" {width}>"#).as_str());
				match self.media_type {
					MediaType::IMAGE => result.push_str(
						format!(r#"<a href="{0}"><img src="{0}"></a>"#, self.uri).as_str(),
					),
					MediaType::VIDEO => todo!(),
					MediaType::AUDIO => todo!(),
				}
				result.push_str(format!(r#"<p class="medium-refname">{}</p>"#, "TODO").as_str());
				if let Some(paragraph) = self.description.as_ref() {
					match paragraph.compile(compiler, document) {
						Ok(res) => result.push_str(res.as_str()),
						Err(err) => return Err(err),
					}
				}
				result.push_str("</div>");

				Ok(result)
			}
			_ => todo!(""),
		}
	}
}

pub struct MediaRule {
	re: [Regex; 1],
	properties: PropertyParser,
}

impl MediaRule {
	pub fn new() -> Self {
		let mut props = HashMap::new();
		props.insert(
			"type".to_string(),
			Property::new(
				false,
				"Override for the media type detection".to_string(),
				None,
			),
		);
		props.insert(
			"width".to_string(),
			Property::new(false, "Override for the media width".to_string(), None),
		);
		Self {
			re: [RegexBuilder::new(
				r"^!\[(.*)\]\(((?:\\.|[^\\\\])*?)\)(?:\[((?:\\.|[^\\\\])*?)\])?((?:\\(?:.|\n)|[^\\\\])*?$)?",
			)
			.multi_line(true)
			.build()
			.unwrap()],
			properties: PropertyParser::new(props),
		}
	}

	fn validate_uri(uri: &str) -> Result<&str, String> {
		let trimmed = uri.trim_start().trim_end();

		if trimmed.is_empty() {
			return Err("URIs is empty".to_string());
		}

		Ok(trimmed)
	}

	fn parse_properties(
		&self,
		colors: &ReportColors,
		token: &Token,
		m: &Option<Match>,
	) -> Result<PropertyMap, Report<'_, (Rc<dyn Source>, Range<usize>)>> {
		match m {
			None => match self.properties.default() {
				Ok(properties) => Ok(properties),
				Err(e) => Err(
					Report::build(ReportKind::Error, token.source(), token.start())
						.with_message("Invalid Media Properties")
						.with_label(
							Label::new((token.source().clone(), token.range.clone()))
								.with_message(format!("Media is missing required property: {e}"))
								.with_color(colors.error),
						)
						.finish(),
				),
			},
			Some(props) => {
				let processed =
					util::process_escaped('\\', "]", props.as_str().trim_start().trim_end());
				match self.properties.parse(processed.as_str()) {
					Err(e) => Err(
						Report::build(ReportKind::Error, token.source(), props.start())
							.with_message("Invalid Media Properties")
							.with_label(
								Label::new((token.source().clone(), props.range()))
									.with_message(e)
									.with_color(colors.error),
							)
							.finish(),
					),
					Ok(properties) => Ok(properties),
				}
			}
		}
	}

	fn detect_filetype(filename: &str) -> Option<MediaType> {
		let sep = match filename.rfind('.') {
			Some(pos) => pos,
			None => return None,
		};

		// TODO: https://developer.mozilla.org/en-US/docs/Web/Media/Formats/Containers
		match filename.split_at(sep + 1).1.to_ascii_lowercase().as_str() {
			"png" | "apng" | "avif" | "gif" | "webp" | "svg" | "bmp" | "jpg" | "jpeg" | "jfif"
			| "pjpeg" | "pjp" => Some(MediaType::IMAGE),
			_ => None,
		}
	}
}
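`detect_filetype` drives the `type` property's "required only when detection fails" behavior: it lowercases the text after the last `.` and maps known image extensions. A standalone sketch of that extension logic (only the image branch exists in the current implementation; the enum is duplicated here so the sketch compiles alone):

```rust
// Standalone sketch of MediaRule::detect_filetype's extension matching.
#[derive(Debug, PartialEq)]
enum MediaType { IMAGE }

fn detect_filetype(filename: &str) -> Option<MediaType> {
    // Everything after the last `.`, lowercased; no dot means no detection.
    let sep = filename.rfind('.')?;
    match filename.split_at(sep + 1).1.to_ascii_lowercase().as_str() {
        "png" | "apng" | "avif" | "gif" | "webp" | "svg" | "bmp" | "jpg" | "jpeg"
        | "jfif" | "pjpeg" | "pjp" => Some(MediaType::IMAGE),
        _ => None,
    }
}

fn main() {
    assert_eq!(detect_filetype("assets/logo.PNG"), Some(MediaType::IMAGE));
    assert_eq!(detect_filetype("movie.mkv"), None); // falls back to the `type` property
    assert_eq!(detect_filetype("no_extension"), None);
    println!("ok");
}
```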

impl RegexRule for MediaRule {
	fn name(&self) -> &'static str { "Media" }

	fn regexes(&self) -> &[regex::Regex] { &self.re }

	fn on_regex_match<'a>(
		&self,
		_: usize,
		parser: &dyn crate::parser::parser::Parser,
		document: &'a (dyn Document<'a> + 'a),
		token: Token,
		matches: Captures,
	) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
		let mut reports = vec![];

		let refname = match (
			matches.get(1).unwrap(),
			validate_refname(matches.get(1).unwrap().as_str()),
		) {
			(_, Ok(refname)) => refname.to_string(),
			(m, Err(err)) => {
				reports.push(
					Report::build(ReportKind::Error, token.source(), m.start())
						.with_message("Invalid Media Refname")
						.with_label(
							Label::new((token.source().clone(), m.range())).with_message(err),
						)
						.finish(),
				);
				return reports;
			}
		};

		let uri = match (
			matches.get(2).unwrap(),
			MediaRule::validate_uri(matches.get(2).unwrap().as_str()),
		) {
			(_, Ok(uri)) => uri.to_string(),
			(m, Err(err)) => {
				reports.push(
					Report::build(ReportKind::Error, token.source(), m.start())
						.with_message("Invalid Media URI")
						.with_label(
							Label::new((token.source().clone(), m.range())).with_message(err),
						)
						.finish(),
				);
				return reports;
			}
		};

		// Properties
		let properties = match self.parse_properties(parser.colors(), &token, &matches.get(3)) {
			Ok(pm) => pm,
			Err(report) => {
				reports.push(report);
				return reports;
			}
		};

		let media_type =
			match Self::detect_filetype(uri.as_str()) {
				Some(media_type) => media_type,
				None => match properties.get("type", |prop, value| {
					MediaType::from_str(value.as_str()).map_err(|e| (prop, e))
				}) {
					Ok((_prop, kind)) => kind,
					Err(e) => match e {
						PropertyMapError::ParseError((prop, err)) => {
							reports.push(
								Report::build(ReportKind::Error, token.source(), token.start())
									.with_message("Invalid Media Property")
									.with_label(
										Label::new((token.source().clone(), token.range.clone()))
											.with_message(format!(
												"Property `type: {}` cannot be converted: {}",
												prop.fg(parser.colors().info),
												err.fg(parser.colors().error)
											))
											.with_color(parser.colors().warning),
									)
									.finish(),
							);
							return reports;
						}
						PropertyMapError::NotFoundError(err) => {
							reports.push(
								Report::build(ReportKind::Error, token.source(), token.start())
									.with_message("Invalid Media Property")
									.with_label(
										Label::new((
											token.source().clone(),
											token.start() + 1..token.end(),
										))
										.with_message(format!("{err}. Required because mediatype could not be detected"))
										.with_color(parser.colors().error),
									)
									.finish(),
							);
							return reports;
						}
					},
				},
			};

		let width = properties
			.get("width", |_, value| -> Result<String, ()> {
				Ok(value.clone())
			})
			.ok()
			.and_then(|(_, s)| Some(s));

		let description = match matches.get(4) {
			Some(content) => {
				let source = Rc::new(VirtualSource::new(
					Token::new(content.range(), token.source()),
					format!("Media[{refname}] description"),
					content.as_str().trim_start().trim_end().to_string(),
				));
				if source.content().is_empty() {
					None
				} else {
					match parse_paragraph(parser, source, document) {
						Ok(paragraph) => Some(*paragraph),
						Err(err) => {
							reports.push(
								Report::build(ReportKind::Error, token.source(), content.start())
									.with_message("Invalid Media Description")
									.with_label(
										Label::new((token.source().clone(), content.range()))
											.with_message(format!(
												"Could not parse description: {err}"
											))
											.with_color(parser.colors().error),
									)
									.finish(),
							);
							return reports;
						}
					}
				}
			}
			None => panic!("Unknown error"),
		};

		// TODO: caption
		let mut group = match document.last_element_mut::<MediaGroup>() {
			Some(group) => group,
			None => {
				parser.push(
					document,
					Box::new(MediaGroup {
						location: token.clone(),
						media: vec![],
					}),
				);

				document.last_element_mut::<MediaGroup>().unwrap()
			}
		};

		if let Err(err) = group.push(Media {
			location: token.clone(),
			reference: refname,
			uri,
			media_type,
			width,
			caption: None,
			description,
		}) {
			reports.push(
				Report::build(ReportKind::Error, token.source(), token.start())
					.with_message("Invalid Media")
					.with_label(
						Label::new((token.source().clone(), token.range.clone()))
							.with_message(err)
							.with_color(parser.colors().error),
					)
					.finish(),
			);
		}

		reports
	}

	fn lua_bindings<'lua>(&self, _lua: &'lua mlua::Lua) -> Vec<(String, mlua::Function<'lua>)> {
		vec![]
	}
}

#[cfg(test)]
mod tests {
	use super::*;

	#[test]
	fn regex() {
		let rule = MediaRule::new();
		let re = &rule.regexes()[0];

		assert!(re.is_match("![refname](some path...)[some properties] some description"));
		assert!(re.is_match(
			r"![refname](some p\)ath...\\)[some propert\]ies\\\\] some description\\nanother line"
		));
		assert!(re.is_match_at("![r1](uri1)[props1] desc1\n![r2](uri2)[props2] desc2", 26));
	}
}

@@ -11,4 +11,6 @@ pub mod section;
pub mod link;
pub mod code;
pub mod tex;
pub mod graphviz;
pub mod raw;
pub mod media;
|
|
@@ -1,10 +1,22 @@
use std::{any::Any, ops::Range, rc::Rc};
use std::any::Any;
use std::ops::Range;
use std::rc::Rc;

use ariadne::Report;
use mlua::{Function, Lua};
use mlua::Function;
use mlua::Lua;
use regex::Regex;

use crate::{compiler::compiler::{Compiler, Target}, document::{document::Document, element::{ElemKind, Element}}, parser::{parser::Parser, rule::Rule, source::{Cursor, Source, Token}}};
use crate::compiler::compiler::Compiler;
use crate::compiler::compiler::Target;
use crate::document::document::Document;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::parser::parser::Parser;
use crate::parser::rule::Rule;
use crate::parser::source::Cursor;
use crate::parser::source::Source;
use crate::parser::source::Token;

// TODO: Full refactor
// Problem is that document parsed from other sources i.e by variables
@@ -14,38 +26,37 @@ use crate::{compiler::compiler::{Compiler, Target}, document::{document::Documen
// The issue is that this would break the current `Token` implementation
// Which would need to be reworked
#[derive(Debug)]
pub struct Paragraph
{
pub struct Paragraph {
location: Token,
pub content: Vec<Box<dyn Element>>
pub content: Vec<Box<dyn Element>>,
}

impl Paragraph
{
impl Paragraph {
pub fn new(location: Token) -> Self {
Self { location, content: Vec::new() }
Self {
location,
content: Vec::new(),
}
}

pub fn is_empty(&self) -> bool { self.content.is_empty() }

pub fn push(&mut self, elem: Box<dyn Element>)
{
if elem.location().source() == self.location().source()
{
self.location.range = self.location.start() .. elem.location().end();
pub fn push(&mut self, elem: Box<dyn Element>) {
if elem.location().source() == self.location().source() {
self.location.range = self.location.start()..elem.location().end();
}
self.content.push(elem);
}

pub fn find_back<P: FnMut(&&Box<dyn Element + 'static>) -> bool>(&self, mut predicate: P)
-> Option<&Box<dyn Element>> {
self.content.iter().rev()
.find(predicate)
pub fn find_back<P: FnMut(&&Box<dyn Element + 'static>) -> bool>(
&self,
predicate: P,
) -> Option<&Box<dyn Element>> {
self.content.iter().rev().find(predicate)
}
}

impl Element for Paragraph
{
impl Element for Paragraph {
fn location(&self) -> &Token { &self.location }

fn kind(&self) -> ElemKind { ElemKind::Special }
@@ -54,74 +65,85 @@ impl Element for Paragraph
fn to_string(&self) -> String { format!("{:#?}", self) }

fn compile(&self, compiler: &Compiler, document: &Document) -> Result<String, String> {
if self.content.is_empty() { return Ok(String::new()) }
fn compile(&self, compiler: &Compiler, document: &dyn Document) -> Result<String, String> {
if self.content.is_empty() {
return Ok(String::new());
}

match compiler.target()
{
match compiler.target() {
Target::HTML => {
let mut result = String::new();
//if prev.is_none() || prev.unwrap().downcast_ref::<Paragraph>().is_none()
{ result.push_str("<p>"); }
{
result.push_str("<p>");
}
//else
//{ result.push_str(" "); }

let err = self.content.iter().try_for_each(|elem| {
match elem.compile(compiler, document)
{
match elem.compile(compiler, document) {
Err(e) => return Err(e),
Ok(content) => { result.push_str(content.as_str()); Ok(()) },
Ok(content) => {
result.push_str(content.as_str());
Ok(())
}
}
});
//if next.is_none() || next.unwrap().downcast_ref::<Paragraph>().is_none()
{ result.push_str("</p>"); }

match err
{
result.push_str("</p>");
}

match err {
Err(e) => Err(e),
Ok(()) => Ok(result),
}
}
Target::LATEX => todo!("Unimplemented compiler")
Target::LATEX => todo!("Unimplemented compiler"),
}
}
}

pub struct ParagraphRule
{
pub struct ParagraphRule {
re: Regex,
}

impl ParagraphRule {
pub fn new() -> Self {
Self {
re: Regex::new(r"\n{2,}").unwrap()
re: Regex::new(r"\n{2,}").unwrap(),
}
}
}

impl Rule for ParagraphRule
{
impl Rule for ParagraphRule {
fn name(&self) -> &'static str { "Paragraphing" }

fn next_match(&self, cursor: &Cursor) -> Option<(usize, Box<dyn Any>)> {
self.re.find_at(cursor.source.content(), cursor.pos)
.and_then(|m| Some((m.start(), Box::new([false;0]) as Box<dyn Any>)) )
self.re
.find_at(cursor.source.content(), cursor.pos)
.and_then(|m| Some((m.start(), Box::new([false; 0]) as Box<dyn Any>)))
}

fn on_match(&self, parser: &dyn Parser, document: &Document, cursor: Cursor, _match_data: Option<Box<dyn Any>>)
-> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>) {

let end_cursor = match self.re.captures_at(cursor.source.content(), cursor.pos)
{
fn on_match(
&self,
parser: &dyn Parser,
document: &dyn Document,
cursor: Cursor,
_match_data: Option<Box<dyn Any>>,
) -> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>) {
let end_cursor = match self.re.captures_at(cursor.source.content(), cursor.pos) {
None => panic!("Unknown error"),
Some(capture) =>
cursor.at(capture.get(0).unwrap().end()-1)
Some(capture) => cursor.at(capture.get(0).unwrap().end() - 1),
};

parser.push(document, Box::new(Paragraph::new(
Token::new(cursor.pos..end_cursor.pos, cursor.source.clone())
)));
parser.push(
document,
Box::new(Paragraph::new(Token::new(
cursor.pos..end_cursor.pos,
cursor.source.clone(),
))),
);

(end_cursor, Vec::new())
}
@@ -1,20 +1,45 @@
use mlua::{Function, Lua};
use regex::{Captures, Regex};
use crate::{compiler::compiler::Compiler, document::element::{ElemKind, Element}, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}, util::{self, Property, PropertyParser}}};
use ariadne::{Fmt, Label, Report, ReportKind};
use crate::compiler::compiler::Compiler;
use crate::document::document::Document;
use std::{collections::HashMap, ops::Range, rc::Rc, str::FromStr};
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::lua::kernel::CTX;
use crate::parser::parser::Parser;
use crate::parser::rule::RegexRule;
use crate::parser::source::Source;
use crate::parser::source::Token;
use crate::parser::util::Property;
use crate::parser::util::PropertyMapError;
use crate::parser::util::PropertyParser;
use crate::parser::util::{self};
use ariadne::Fmt;
use ariadne::Label;
use ariadne::Report;
use ariadne::ReportKind;
use mlua::Error::BadArgument;
use mlua::Function;
use mlua::Lua;
use regex::Captures;
use regex::Regex;
use std::collections::HashMap;
use std::ops::Range;
use std::rc::Rc;
use std::str::FromStr;
use std::sync::Arc;

#[derive(Debug)]
struct Raw {
location: Token,
kind: ElemKind,
content: String,
pub(self) location: Token,
pub(self) kind: ElemKind,
pub(self) content: String,
}

impl Raw {
fn new(location: Token, kind: ElemKind, content: String) -> Self {
Self { location, kind, content }
Self {
location,
kind,
content,
}
}
}
@@ -26,7 +51,7 @@ impl Element for Raw {
fn to_string(&self) -> String { format!("{self:#?}") }

fn compile(&self, compiler: &Compiler, _document: &Document) -> Result<String, String> {
fn compile(&self, _compiler: &Compiler, _document: &dyn Document) -> Result<String, String> {
Ok(self.content.clone())
}
}
@@ -39,32 +64,40 @@ pub struct RawRule {
impl RawRule {
pub fn new() -> Self {
let mut props = HashMap::new();
props.insert("kind".to_string(),
props.insert(
"kind".to_string(),
Property::new(
true,
"Element display kind".to_string(),
Some("inline".to_string())));
Some("inline".to_string()),
),
);
Self {
re: [
Regex::new(r"\{\?(?:\[((?:\\.|[^\[\]\\])*?)\])?(?:((?:\\.|[^\\\\])*?)(\?\}))?").unwrap()
Regex::new(r"\{\?(?:\[((?:\\.|[^\[\]\\])*?)\])?(?:((?:\\.|[^\\\\])*?)(\?\}))?")
.unwrap(),
],
properties: PropertyParser::new(props)
properties: PropertyParser::new(props),
}
}
}

impl RegexRule for RawRule
{
impl RegexRule for RawRule {
fn name(&self) -> &'static str { "Raw" }

fn regexes(&self) -> &[regex::Regex] { &self.re }

fn on_regex_match(&self, _index: usize, parser: &dyn Parser, document: &Document, token: Token, matches: Captures)
-> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
fn on_regex_match(
&self,
_index: usize,
parser: &dyn Parser,
document: &dyn Document,
token: Token,
matches: Captures,
) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
let mut reports = vec![];

let raw_content = match matches.get(2)
{
let raw_content = match matches.get(2) {
// Unterminated
None => {
reports.push(
@@ -72,34 +105,38 @@ impl RegexRule for RawRule
.with_message("Unterminated Raw Code")
.with_label(
Label::new((token.source().clone(), token.range.clone()))
.with_message(format!("Missing terminating `{}` after first `{}`",
.with_message(format!(
"Missing terminating `{}` after first `{}`",
"?}".fg(parser.colors().info),
"{?".fg(parser.colors().info)))
.with_color(parser.colors().error))
.finish());
"{?".fg(parser.colors().info)
))
.with_color(parser.colors().error),
)
.finish(),
);
return reports;
}
Some(content) => {
let processed = util::process_escaped('\\', "?}",
content.as_str().trim_start().trim_end());
let processed =
util::process_escaped('\\', "?}", content.as_str().trim_start().trim_end());

if processed.is_empty()
{
if processed.is_empty() {
reports.push(
Report::build(ReportKind::Warning, token.source(), content.start())
.with_message("Empty Raw Code")
.with_label(
Label::new((token.source().clone(), content.range()))
.with_message("Raw code is empty")
.with_color(parser.colors().warning))
.finish());
.with_color(parser.colors().warning),
)
.finish(),
);
}
processed
}
};

let properties = match matches.get(1)
{
let properties = match matches.get(1) {
None => match self.properties.default() {
Ok(properties) => properties,
Err(e) => {
@@ -109,16 +146,17 @@ impl RegexRule for RawRule
.with_label(
Label::new((token.source().clone(), token.range.clone()))
.with_message(format!("Raw code is missing properties: {e}"))
.with_color(parser.colors().error))
.finish());
.with_color(parser.colors().error),
)
.finish(),
);
return reports;
},
}
},
Some(props) => {
let processed = util::process_escaped('\\', "]",
props.as_str().trim_start().trim_end());
match self.properties.parse(processed.as_str())
{
let processed =
util::process_escaped('\\', "]", props.as_str().trim_start().trim_end());
match self.properties.parse(processed.as_str()) {
Err(e) => {
reports.push(
Report::build(ReportKind::Error, token.source(), props.start())
@@ -126,43 +164,112 @@ impl RegexRule for RawRule
.with_label(
Label::new((token.source().clone(), props.range()))
.with_message(e)
.with_color(parser.colors().error))
.finish());
.with_color(parser.colors().error),
)
.finish(),
);
return reports;
}
Ok(properties) => properties
Ok(properties) => properties,
}
}
};

let raw_kind : ElemKind = match properties.get("kind",
|prop, value| ElemKind::from_str(value.as_str()).map_err(|e| (prop, e)))
{
let raw_kind: ElemKind = match properties.get("kind", |prop, value| {
ElemKind::from_str(value.as_str()).map_err(|e| (prop, e))
}) {
Ok((_prop, kind)) => kind,
Err((prop, e)) => {
Err(e) => match e {
PropertyMapError::ParseError((prop, err)) => {
reports.push(
Report::build(ReportKind::Error, token.source(), token.start())
.with_message("Invalid Raw Code Property")
.with_label(
Label::new((token.source().clone(), token.range.clone()))
.with_message(format!("Property `kind: {}` cannot be converted: {}",
.with_message(format!(
"Property `kind: {}` cannot be converted: {}",
prop.fg(parser.colors().info),
e.fg(parser.colors().error)))
.with_color(parser.colors().warning))
.finish());
err.fg(parser.colors().error)
))
.with_color(parser.colors().warning),
)
.finish(),
);
return reports;
}
PropertyMapError::NotFoundError(err) => {
reports.push(
Report::build(ReportKind::Error, token.source(), token.start())
.with_message("Invalid Code Property")
.with_label(
Label::new((
token.source().clone(),
token.start() + 1..token.end(),
))
.with_message(format!(
"Property `{}` is missing",
err.fg(parser.colors().info)
))
.with_color(parser.colors().warning),
)
.finish(),
);
return reports;
}
},
};

parser.push(document, Box::new(Raw::new(
token.clone(),
raw_kind,
raw_content
)));
parser.push(
document,
Box::new(Raw {
location: token.clone(),
kind: raw_kind,
content: raw_content,
}),
);

reports
}

// TODO
fn lua_bindings<'lua>(&self, _lua: &'lua Lua) -> Vec<(String, Function<'lua>)> { vec![] }
fn lua_bindings<'lua>(&self, lua: &'lua Lua) -> Vec<(String, Function<'lua>)> {
let mut bindings = vec![];

bindings.push((
"push".to_string(),
lua.create_function(|_, (kind, content): (String, String)| {
// Validate kind
let kind = match ElemKind::from_str(kind.as_str()) {
Ok(kind) => kind,
Err(e) => {
return Err(BadArgument {
to: Some("push".to_string()),
pos: 1,
name: Some("kind".to_string()),
cause: Arc::new(mlua::Error::external(format!(
"Wrong section kind specified: {e}"
))),
})
}
};

CTX.with_borrow(|ctx| {
ctx.as_ref().map(|ctx| {
ctx.parser.push(
ctx.document,
Box::new(Raw {
location: ctx.location.clone(),
kind,
content,
}),
);
})
});

Ok(())
})
.unwrap(),
));

bindings
}
}
@@ -1,23 +1,38 @@
use crate::parser::parser::Parser;

use super::{code::CodeRule, comment::CommentRule, import::ImportRule, link::LinkRule, list::ListRule, paragraph::ParagraphRule, raw::RawRule, script::ScriptRule, section::SectionRule, style::StyleRule, tex::TexRule, text::TextRule, variable::{VariableRule, VariableSubstitutionRule}};
use super::code::CodeRule;
use super::comment::CommentRule;
use super::graphviz::GraphRule;
use super::import::ImportRule;
use super::link::LinkRule;
use super::list::ListRule;
use super::media::MediaRule;
use super::paragraph::ParagraphRule;
use super::raw::RawRule;
use super::script::ScriptRule;
use super::section::SectionRule;
use super::style::StyleRule;
use super::tex::TexRule;
use super::text::TextRule;
use super::variable::VariableRule;
use super::variable::VariableSubstitutionRule;

pub fn register<P: Parser>(parser: &mut P) {
parser.add_rule(Box::new(CommentRule::new()), None).unwrap();
parser.add_rule(Box::new(ParagraphRule::new()), None).unwrap();
parser.add_rule(Box::new(ImportRule::new()), None).unwrap();
parser.add_rule(Box::new(ScriptRule::new()), None).unwrap();
parser.add_rule(Box::new(VariableRule::new()), None).unwrap();
parser.add_rule(Box::new(VariableSubstitutionRule::new()), None).unwrap();
parser.add_rule(Box::new(RawRule::new()), None).unwrap();
parser.add_rule(Box::new(ListRule::new()), None).unwrap();
parser.add_rule(Box::new(CodeRule::new()), None).unwrap();
parser.add_rule(Box::new(TexRule::new()), None).unwrap();
parser.add_rule(Box::new(GraphRule::new()), None).unwrap();
parser.add_rule(Box::new(MediaRule::new()), None).unwrap();

pub fn register<P: Parser>(parser: &mut P)
{
parser.add_rule(Box::new(CommentRule::new()), None);
parser.add_rule(Box::new(ParagraphRule::new()), None);
parser.add_rule(Box::new(ImportRule::new()), None);
parser.add_rule(Box::new(ScriptRule::new()), None);
parser.add_rule(Box::new(VariableRule::new()), None);
parser.add_rule(Box::new(VariableSubstitutionRule::new()), None);
parser.add_rule(Box::new(RawRule::new()), None);
parser.add_rule(Box::new(ListRule::new()), None);
parser.add_rule(Box::new(CodeRule::new()), None);
parser.add_rule(Box::new(TexRule::new()), None);

parser.add_rule(Box::new(StyleRule::new()), None);
parser.add_rule(Box::new(SectionRule::new()), None);
parser.add_rule(Box::new(LinkRule::new()), None);
parser.add_rule(Box::new(TextRule::default()), None);
parser.add_rule(Box::new(StyleRule::new()), None).unwrap();
parser.add_rule(Box::new(SectionRule::new()), None).unwrap();
parser.add_rule(Box::new(LinkRule::new()), None).unwrap();
parser.add_rule(Box::new(TextRule::default()), None).unwrap();
}
@@ -1,16 +1,29 @@
use mlua::{Function, Lua};
use regex::{Captures, Regex};
use crate::{lua::kernel::{Kernel, KernelContext, KernelHolder}, parser::{parser::{Parser, ReportColors}, rule::RegexRule, source::{Source, Token, VirtualSource}, util}};
use ariadne::{Fmt, Label, Report, ReportKind};
use crate::document::document::Document;
use std::{ops::Range, rc::Rc};
use crate::lua::kernel::Kernel;
use crate::lua::kernel::KernelContext;
use crate::parser::parser::Parser;
use crate::parser::parser::ReportColors;
use crate::parser::rule::RegexRule;
use crate::parser::source::Source;
use crate::parser::source::Token;
use crate::parser::source::VirtualSource;
use crate::parser::util;
use ariadne::Fmt;
use ariadne::Label;
use ariadne::Report;
use ariadne::ReportKind;
use mlua::Function;
use mlua::Lua;
use regex::Captures;
use regex::Regex;
use std::ops::Range;
use std::rc::Rc;

use super::text::Text;

pub struct ScriptRule
{
pub struct ScriptRule {
re: [Regex; 2],
eval_kinds: [(&'static str, &'static str); 3]
eval_kinds: [(&'static str, &'static str); 3],
}

impl ScriptRule {
@@ -18,57 +31,69 @@ impl ScriptRule {
Self {
re: [
Regex::new(r"(?:^|\n)@<(?:(.*)\n?)((?:\\.|[^\\\\])*?)(?:\n?)>@").unwrap(),
Regex::new(r"%<([^\s[:alpha:]])?(?:\[(.*?)\])?((?:\\.|[^\\\\])*?)(?:\n?)>%").unwrap()
Regex::new(r"%<(?:\[(.*?)\])?([^\s[:alpha:]])?((?:\\.|[^\\\\])*?)(?:\n?)>%")
.unwrap(),
],
eval_kinds: [
("", "Eval"),
("\"", "Eval to text"),
("!", "Eval and parse"),
]
],
}
}

fn validate_kernel_name(colors: &ReportColors, name: &str)
-> Result<String, String> {
fn validate_kernel_name(colors: &ReportColors, name: &str) -> Result<String, String> {
let trimmed = name.trim_end().trim_start();
if trimmed.is_empty() { return Ok("main".to_string()) }
else if trimmed.find(|c: char| c.is_whitespace()).is_some() {
return Err(format!("Kernel name `{}` contains whitespaces",
trimmed.fg(colors.highlight)))
if trimmed.is_empty() {
return Ok("main".to_string());
} else if trimmed.find(|c: char| c.is_whitespace()).is_some() {
return Err(format!(
"Kernel name `{}` contains whitespaces",
trimmed.fg(colors.highlight)
));
}

Ok(trimmed.to_string())
}

fn validate_kind(&self, colors: &ReportColors, kind: &str)
-> Result<usize, String> {
match self.eval_kinds.iter().position(|(kind_symbol, _)| kind == *kind_symbol)
fn validate_kind(&self, colors: &ReportColors, kind: &str) -> Result<usize, String> {
match self
.eval_kinds
.iter()
.position(|(kind_symbol, _)| kind == *kind_symbol)
{
Some(id) => Ok(id),
None => Err(format!("Unable to find eval kind `{}`. Available kinds:{}",
None => Err(format!(
"Unable to find eval kind `{}`. Available kinds:{}",
kind.fg(colors.highlight),
self.eval_kinds.iter().fold(String::new(), |out, (symbol, name)| {
self.eval_kinds
.iter()
.fold(String::new(), |out, (symbol, name)| {
out + format!("\n - '{symbol}' => {name}").as_str()
})))
})
)),
}
}
}

impl RegexRule for ScriptRule
{
impl RegexRule for ScriptRule {
fn name(&self) -> &'static str { "Script" }

fn regexes(&self) -> &[regex::Regex] { &self.re }

fn on_regex_match(&self, index: usize, parser: &dyn Parser, document: &Document, token: Token, matches: Captures)
-> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
fn on_regex_match<'a>(
&self,
index: usize,
parser: &dyn Parser,
document: &'a dyn Document<'a>,
token: Token,
matches: Captures,
) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
let mut reports = vec![];

let kernel_name = match matches.get(if index == 0 {1} else {2}) {
let kernel_name = match matches.get(1) {
None => "main".to_string(),
Some(name) => {
match ScriptRule::validate_kernel_name(parser.colors(), name.as_str())
{
Some(name) => match ScriptRule::validate_kernel_name(parser.colors(), name.as_str()) {
Ok(name) => name,
Err(e) => {
reports.push(
@@ -77,77 +102,83 @@ impl RegexRule for ScriptRule
.with_label(
Label::new((token.source(), name.range()))
.with_message(e)
.with_color(parser.colors().error))
.finish());
.with_color(parser.colors().error),
)
.finish(),
);
return reports;
}
}
}
},
};
let kernel_name = matches.get(if index == 0 {1} else {2})
.and_then(|name| {
let trimmed = name.as_str().trim_start().trim_end();
(!trimmed.is_empty()).then_some(trimmed)
})
.unwrap_or("main");
let kernel = parser.get_kernel(kernel_name).unwrap_or_else(|| {
parser.insert_kernel(kernel_name.to_string(), Kernel::new(parser))
});
let kernel = parser
.get_kernel(kernel_name.as_str())
.unwrap_or_else(|| parser.insert_kernel(kernel_name.to_string(), Kernel::new(parser)));

let kernel_data = matches.get(if index == 0 {2} else {3})
let kernel_data = matches
.get(if index == 0 { 2 } else { 3 })
.and_then(|code| {
let trimmed = code.as_str().trim_start().trim_end();
(!trimmed.is_empty()).then_some((trimmed, code.range()))
}).or_else(|| {
})
.or_else(|| {
reports.push(
Report::build(ReportKind::Warning, token.source(), token.start())
.with_message("Invalid kernel code")
.with_label(
Label::new((token.source(), token.start()+1..token.end()))
Label::new((token.source(), token.start() + 1..token.end()))
.with_message("Kernel code is empty")
.with_color(parser.colors().warning))
.finish());
.with_color(parser.colors().warning),
)
.finish(),
);

None
});

if kernel_data.is_none() { return reports; }
if kernel_data.is_none() {
return reports;
}

let (kernel_content, kernel_range) = kernel_data.unwrap();
let source = Rc::new(VirtualSource::new(
Token::new(kernel_range, token.source()),
format!("{}#{}:lua_kernel@{kernel_name}", token.source().name(), matches.get(0).unwrap().start()),
util::process_escaped('\\', ">@", kernel_content)
format!(
"{}#{}:lua_kernel@{kernel_name}",
token.source().name(),
matches.get(0).unwrap().start()
),
util::process_escaped('\\', ">@", kernel_content),
)) as Rc<dyn Source>;

let execute = |lua: &Lua|
{
let chunk = lua.load(source.content())
.set_name(kernel_name);
let execute = |lua: &Lua| {
let chunk = lua.load(source.content()).set_name(kernel_name);

if index == 0 // Exec
{
if let Err(e) = chunk.exec()
if index == 0
// Exec
{
if let Err(e) = chunk.exec() {
reports.push(
Report::build(ReportKind::Error, source.clone(), 0)
.with_message("Invalid kernel code")
.with_label(
Label::new((source.clone(), 0..source.content().len()))
.with_message(format!("Kernel execution failed:\n{}", e.to_string()))
.with_color(parser.colors().error))
.finish());
.with_message(format!(
"Kernel execution failed:\n{}",
e.to_string()
))
.with_color(parser.colors().error),
)
.finish(),
);
return reports;
}
}
else // Eval
} else
// Eval
{
// Validate kind
let kind = match matches.get(1) {
let kind = match matches.get(2) {
None => 0,
Some(kind) => {
match self.validate_kind(parser.colors(), kind.as_str())
{
Some(kind) => match self.validate_kind(parser.colors(), kind.as_str()) {
Ok(kind) => kind,
Err(msg) => {
reports.push(
@@ -156,63 +187,76 @@ impl RegexRule for ScriptRule
.with_label(
Label::new((token.source(), kind.range()))
.with_message(msg)
.with_color(parser.colors().error))
.finish());
.with_color(parser.colors().error),
)
.finish(),
);
return reports;
}
}
}
},
};

if kind == 0 // Eval
{
if let Err(e) = chunk.eval::<()>()
if kind == 0
// Eval
{
if let Err(e) = chunk.eval::<()>() {
reports.push(
Report::build(ReportKind::Error, source.clone(), 0)
.with_message("Invalid kernel code")
.with_label(
Label::new((source.clone(), 0..source.content().len()))
.with_message(format!("Kernel evaluation failed:\n{}", e.to_string()))
.with_color(parser.colors().error))
.finish());
.with_message(format!(
"Kernel evaluation failed:\n{}",
e.to_string()
))
.with_color(parser.colors().error),
)
.finish(),
);
}
}
else // Eval to string
{
match chunk.eval::<String>()
} else
// Eval to string
{
match chunk.eval::<String>() {
Ok(result) => {
if kind == 1 // Eval to text
if kind == 1
// Eval to text
{
if !result.is_empty()
{
parser.push(document, Box::new(Text::new(
if !result.is_empty() {
parser.push(
document,
Box::new(Text::new(
Token::new(1..source.content().len(), source.clone()),
util::process_text(document, result.as_str()),
)));
)),
);
}
}
else if kind == 2 // Eval and Parse
} else if kind == 2
// Eval and Parse
{
let parse_source = Rc::new(VirtualSource::new(
Token::new(0..source.content().len(), source.clone()),
format!("parse({})", source.name()),
result
result,
)) as Rc<dyn Source>;

parser.parse_into(parse_source, document);
}
},
}
Err(e) => {
reports.push(
Report::build(ReportKind::Error, source.clone(), 0)
.with_message("Invalid kernel code")
.with_label(
Label::new((source.clone(), 0..source.content().len()))
.with_message(format!("Kernel evaluation failed:\n{}", e.to_string()))
.with_color(parser.colors().error))
.finish());
.with_message(format!(
"Kernel evaluation failed:\n{}",
e.to_string()
))
.with_color(parser.colors().error),
)
.finish(),
);
}
}
}
@@ -224,7 +268,7 @@ impl RegexRule for ScriptRule
let ctx = KernelContext {
location: Token::new(0..source.content().len(), source.clone()),
parser,
document
document,
};

kernel.run_with_context(ctx, execute)
@ -1,8 +1,8 @@
use mlua::{Error::BadArgument, Function, Lua};
use regex::Regex;
use crate::{compiler::compiler::Target, lua::kernel::CTX, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}}};
use crate::{compiler::compiler::Target, document::document::Document, lua::kernel::CTX, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}}};
use ariadne::{Report, Fmt, Label, ReportKind};
use crate::{compiler::compiler::Compiler, document::{document::Document, element::{ElemKind, Element, ReferenceableElement}}};
use crate::{compiler::compiler::Compiler, document::element::{ElemKind, Element, ReferenceableElement}};
use std::{ops::Range, rc::Rc, sync::Arc};

#[derive(Debug)]

@ -21,7 +21,7 @@ impl Element for Section
fn element_name(&self) -> &'static str { "Section" }
fn to_string(&self) -> String { format!("{self:#?}") }
fn as_referenceable(&self) -> Option<&dyn ReferenceableElement> { Some(self) }
fn compile(&self, compiler: &Compiler, _document: &Document) -> Result<String, String> {
fn compile(&self, compiler: &Compiler, _document: &dyn Document) -> Result<String, String> {
match compiler.target()
{
Target::HTML => {

@ -61,7 +61,7 @@ impl RegexRule for SectionRule {
fn regexes(&self) -> &[Regex] { &self.re }

fn on_regex_match(&self, _: usize, parser: &dyn Parser, document: &Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>
fn on_regex_match(&self, _: usize, parser: &dyn Parser, document: &dyn Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>
{
let mut result = vec![];
let section_depth = match matches.get(1)

@ -89,6 +89,7 @@ impl RegexRule for SectionRule {
// [Optional] Reference name
let section_refname = matches.get(2).map_or_else(|| None,
|refname| {
/* TODO: Wait for reference rework
// Check for duplicate reference
if let Some((ref_doc, reference)) = document.get_reference(refname.as_str())
{

@ -112,6 +113,7 @@ impl RegexRule for SectionRule {
.with_note(format!("Previous reference was overwritten"))
.finish());
}
*/
Some(refname.as_str().to_string())
});
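A recurring change throughout these commits is the switch from `&Document` to `&dyn Document`: `Document` becomes a trait rather than a concrete struct, so every borrow of it must be an explicit trait object. A minimal self-contained sketch of that pattern (all names here are hypothetical stand-ins, not taken from the repository):

```rust
// `Document` as a trait: callers can no longer take `&Document` as a plain
// struct reference; they take `&dyn Document` (a trait object) instead.
trait Document {
    fn name(&self) -> &str;
}

struct LangDocument {
    name: String,
}

impl Document for LangDocument {
    fn name(&self) -> &str {
        &self.name
    }
}

// Before the refactor this parameter would have been `&LangDocument`;
// with a trait it must be `&dyn Document` (or a generic bound).
fn describe(document: &dyn Document) -> String {
    format!("document `{}`", document.name())
}

fn main() {
    let doc = LangDocument { name: "main.nml".to_string() };
    assert_eq!(describe(&doc), "document `main.nml`");
}
```

A generic `fn describe<D: Document>(document: &D)` would also compile, but the trait-object form keeps the rule and parser vtables uniform, which is presumably why the diff prefers `dyn`.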
@ -1,8 +1,7 @@
use mlua::{Function, Lua};
use regex::{Captures, Regex};
use crate::{compiler::compiler::{Compiler, Target}, document::element::{ElemKind, Element}, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}, state::State}};
use crate::{compiler::compiler::{Compiler, Target}, document::{document::{DocumentAccessors, Document}, element::{ElemKind, Element}}, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}, state::State}};
use ariadne::{Fmt, Label, Report, ReportKind};
use crate::document::document::Document;
use crate::parser::state::Scope;
use std::{cell::RefCell, ops::Range, rc::Rc};
use lazy_static::lazy_static;

@ -29,7 +28,7 @@ impl Element for Style
fn kind(&self) -> ElemKind { ElemKind::Inline }
fn element_name(&self) -> &'static str { "Section" }
fn to_string(&self) -> String { format!("{self:#?}") }
fn compile(&self, compiler: &Compiler, _document: &Document) -> Result<String, String> {
fn compile(&self, compiler: &Compiler, _document: &dyn Document) -> Result<String, String> {
match compiler.target()
{
Target::HTML => {

@ -66,7 +65,7 @@ impl State for StyleState
{
fn scope(&self) -> Scope { Scope::PARAGRAPH }

fn on_remove<'a>(&self, parser: &dyn Parser, document: &Document) -> Vec<Report<'a, (Rc<dyn Source>, Range<usize>)>> {
fn on_remove<'a>(&self, parser: &dyn Parser, document: &dyn Document) -> Vec<Report<'a, (Rc<dyn Source>, Range<usize>)>> {
let mut result = Vec::new();
self.toggled
.iter()

@ -80,7 +79,7 @@ impl State for StyleState

//let active_range = range.start .. paragraph.location().end()-1;

let paragraph = document.last_element::<Paragraph>(false).unwrap();
let paragraph = document.last_element::<Paragraph>().unwrap();
let paragraph_end = paragraph.content.last()
.and_then(|last| Some((last.location().source(), last.location().end()-1 .. last.location().end())))
.unwrap();

@ -145,7 +144,7 @@ impl RegexRule for StyleRule

fn regexes(&self) -> &[regex::Regex] { &self.re }

fn on_regex_match(&self, index: usize, parser: &dyn Parser, document: &Document, token: Token, _matches: Captures) -> Vec<Report<(Rc<dyn Source>, Range<usize>)>> {
fn on_regex_match(&self, index: usize, parser: &dyn Parser, document: &dyn Document, token: Token, _matches: Captures) -> Vec<Report<(Rc<dyn Source>, Range<usize>)>> {
let result = vec![];

let query = parser.state().query(&STATE_NAME);
@ -1,32 +1,52 @@
use std::{io::{Read, Write}, ops::Range, process::{Command, Stdio}, rc::Rc, sync::Once};
use std::io::Read;
use std::io::Write;
use std::ops::Range;
use std::process::Command;
use std::process::Stdio;
use std::rc::Rc;
use std::sync::Once;

use ariadne::{Fmt, Label, Report, ReportKind};
use crypto::{digest::Digest, sha2::Sha512};
use mlua::{Function, Lua};
use regex::{Captures, Regex};
use ariadne::Fmt;
use ariadne::Label;
use ariadne::Report;
use ariadne::ReportKind;
use crypto::digest::Digest;
use crypto::sha2::Sha512;
use mlua::Function;
use mlua::Lua;
use regex::Captures;
use regex::Regex;

use crate::{cache::cache::{Cached, CachedError}, compiler::compiler::{Compiler, Target}, document::{document::Document, element::{ElemKind, Element}}, parser::{parser::Parser, rule::RegexRule, source::{Source, Token}, util}};
use crate::cache::cache::Cached;
use crate::cache::cache::CachedError;
use crate::compiler::compiler::Compiler;
use crate::compiler::compiler::Target;
use crate::document::document::Document;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::parser::parser::Parser;
use crate::parser::rule::RegexRule;
use crate::parser::source::Source;
use crate::parser::source::Token;
use crate::parser::util;

#[derive(Debug, PartialEq, Eq)]
enum TexKind
{
enum TexKind {
Block,
Inline,
}

impl From<&TexKind> for ElemKind
{
impl From<&TexKind> for ElemKind {
fn from(value: &TexKind) -> Self {
match value {
TexKind::Inline => ElemKind::Inline,
_ => ElemKind::Block
_ => ElemKind::Block,
}
}
}

#[derive(Debug)]
struct Tex
{
struct Tex {
location: Token,
block: TexKind,
env: String,
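The import rewrites in this hunk (grouped `use` trees split into one item per line, trailing commas added, braces moved onto the declaration line) are exactly what rustfmt produces with item-granularity imports. A possible `rustfmt.toml` that would reproduce this style (an assumption for illustration — the actual configuration file is not part of this diff, and `imports_granularity` requires a nightly rustfmt):

```toml
# rustfmt.toml (hypothetical; imports_granularity is a nightly-only option)
imports_granularity = "Item"  # one `use` per imported item, as in this diff
hard_tabs = true
```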
@ -35,49 +55,59 @@ struct Tex
}

impl Tex {
fn new(location: Token, block: TexKind, env: String, tex: String, caption: Option<String>) -> Self {
Self { location, block, env, tex, caption }
fn new(
location: Token,
block: TexKind,
env: String,
tex: String,
caption: Option<String>,
) -> Self {
Self {
location,
block,
env,
tex,
caption,
}
}

fn format_latex(fontsize: &String, preamble: &String, tex: &String) -> FormattedTex
{
FormattedTex(format!(r"\documentclass[{}pt,preview]{{standalone}}
fn format_latex(fontsize: &String, preamble: &String, tex: &String) -> FormattedTex {
FormattedTex(format!(
r"\documentclass[{}pt,preview]{{standalone}}
{}
\begin{{document}}
\begin{{preview}}
{}
\end{{preview}}
\end{{document}}",
fontsize, preamble, tex))
fontsize, preamble, tex
))
}
}

struct FormattedTex(String);

impl FormattedTex
{
impl FormattedTex {
/// Renders latex to svg
fn latex_to_svg(&self, exec: &String, fontsize: &String) -> Result<String, String>
{
fn latex_to_svg(&self, exec: &String, fontsize: &String) -> Result<String, String> {
print!("Rendering LaTex `{}`... ", self.0);
let process = match Command::new(exec)
.arg("--fontsize").arg(fontsize)
.arg("--fontsize")
.arg(fontsize)
.stdout(Stdio::piped())
.stdin(Stdio::piped())
.spawn()
{
Err(e) => return Err(format!("Could not spawn `{exec}`: {}", e)),
Ok(process) => process
Ok(process) => process,
};

if let Err(e) = process.stdin.unwrap().write_all(self.0.as_bytes())
{
if let Err(e) = process.stdin.unwrap().write_all(self.0.as_bytes()) {
panic!("Unable to write to `latex2svg`'s stdin: {}", e);
}

let mut result = String::new();
match process.stdout.unwrap().read_to_string(&mut result)
{
match process.stdout.unwrap().read_to_string(&mut result) {
Err(e) => panic!("Unable to read `latex2svg` stdout: {}", e),
Ok(_) => {}
}

@ -87,8 +117,7 @@ impl FormattedTex
}
}

impl Cached for FormattedTex
{
impl Cached for FormattedTex {
type Key = String;
type Value = String;

@ -115,71 +144,74 @@ impl Cached for FormattedTex
}

impl Element for Tex {
fn location(&self) -> &Token { &self.location }
fn location(&self) -> &Token {
&self.location
}

fn kind(&self) -> ElemKind { (&self.block).into() }
fn kind(&self) -> ElemKind {
(&self.block).into()
}

fn element_name(&self) -> &'static str { "LaTeX" }
fn element_name(&self) -> &'static str {
"LaTeX"
}

fn to_string(&self) -> String { format!("{self:#?}") }

fn compile(&self, compiler: &Compiler, document: &Document)
-> Result<String, String> {
fn to_string(&self) -> String {
format!("{self:#?}")
}

fn compile(&self, compiler: &Compiler, document: &dyn Document) -> Result<String, String> {
match compiler.target() {
Target::HTML => {
static CACHE_INIT : Once = Once::new();
CACHE_INIT.call_once(|| if let Some(mut con) = compiler.cache() {
if let Err(e) = FormattedTex::init(&mut con)
{
static CACHE_INIT: Once = Once::new();
CACHE_INIT.call_once(|| {
if let Some(mut con) = compiler.cache() {
if let Err(e) = FormattedTex::init(&mut con) {
eprintln!("Unable to create cache table: {e}");
}
}
});

let exec = document.get_variable(format!("tex.{}.exec", self.env))
.map_or("latex2svg".to_string(), |(_, var)| var.to_string());
let exec = document
.get_variable(format!("tex.{}.exec", self.env).as_str())
.map_or("latex2svg".to_string(), |var| var.to_string());
// FIXME: Because fontsize is passed as an arg, verify that it cannot be used to execute python/shell code
let fontsize = document.get_variable(format!("tex.{}.fontsize", self.env))
.map_or("12".to_string(), |(_, var)| var.to_string());
let preamble = document.get_variable(format!("tex.{}.preamble", self.env))
.map_or("".to_string(), |(_, var)| var.to_string());
let prepend = if self.block == TexKind::Inline { "".to_string() }
else
{
document.get_variable(format!("tex.{}.block_prepend", self.env))
.map_or("".to_string(), |(_, var)| var.to_string()+"\n")
let fontsize = document
.get_variable(format!("tex.{}.fontsize", self.env).as_str())
.map_or("12".to_string(), |var| var.to_string());
let preamble = document
.get_variable(format!("tex.{}.preamble", self.env).as_str())
.map_or("".to_string(), |var| var.to_string());
let prepend = if self.block == TexKind::Inline {
"".to_string()
} else {
document
.get_variable(format!("tex.{}.block_prepend", self.env).as_str())
.map_or("".to_string(), |var| var.to_string() + "\n")
};

let latex = match self.block
{
TexKind::Inline => Tex::format_latex(
&fontsize,
&preamble,
&format!("${{{}}}$", self.tex)),
_ => Tex::format_latex(
&fontsize,
&preamble,
&format!("{prepend}{}", self.tex))
let latex = match self.block {
TexKind::Inline => {
Tex::format_latex(&fontsize, &preamble, &format!("${{{}}}$", self.tex))
}
_ => Tex::format_latex(&fontsize, &preamble, &format!("{prepend}{}", self.tex)),
};

if let Some(mut con) = compiler.cache()
{
match latex.cached(&mut con, |s| s.latex_to_svg(&exec, &fontsize))
{
if let Some(mut con) = compiler.cache() {
match latex.cached(&mut con, |s| s.latex_to_svg(&exec, &fontsize)) {
Ok(s) => Ok(s),
Err(e) => match e
{
CachedError::SqlErr(e) => Err(format!("Querying the cache failed: {e}")),
CachedError::GenErr(e) => Err(e)
Err(e) => match e {
CachedError::SqlErr(e) => {
Err(format!("Querying the cache failed: {e}"))
}
CachedError::GenErr(e) => Err(e),
},
}
}
else
{
} else {
latex.latex_to_svg(&exec, &fontsize)
}
}
_ => todo!("Unimplemented")
_ => todo!("Unimplemented"),
}
}
}
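The `CACHE_INIT` guard above relies on `std::sync::Once` so the cache table is created at most once, even though `compile` runs for every element. A self-contained sketch of that idiom (the counter is added purely for illustration):

```rust
use std::sync::atomic::{AtomicU32, Ordering};
use std::sync::Once;

static INIT: Once = Once::new();
static CALLS: AtomicU32 = AtomicU32::new(0);

// Stand-in for the one-time cache-table creation in `compile`.
fn ensure_cache_table() {
    // The closure body runs at most once, no matter how often this is called.
    INIT.call_once(|| {
        CALLS.fetch_add(1, Ordering::SeqCst);
    });
}

fn main() {
    for _ in 0..3 {
        ensure_cache_table();
    }
    // Three calls, but the initializer only ran once.
    assert_eq!(CALLS.load(Ordering::SeqCst), 1);
}
```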
@ -199,23 +231,32 @@ impl TexRule {
}
}

impl RegexRule for TexRule
{
fn name(&self) -> &'static str { "Tex" }
impl RegexRule for TexRule {
fn name(&self) -> &'static str {
"Tex"
}

fn regexes(&self) -> &[regex::Regex] { &self.re }
fn regexes(&self) -> &[regex::Regex] {
&self.re
}

fn on_regex_match(&self, index: usize, parser: &dyn Parser, document: &Document, token: Token, matches: Captures)
-> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
fn on_regex_match(
&self,
index: usize,
parser: &dyn Parser,
document: &dyn Document,
token: Token,
matches: Captures,
) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
let mut reports = vec![];

let tex_env = matches.get(1)
let tex_env = matches
.get(1)
.and_then(|env| Some(env.as_str().trim_start().trim_end()))
.and_then(|env| (!env.is_empty()).then_some(env))
.unwrap_or("main");

let tex_content = match matches.get(2)
{
let tex_content = match matches.get(2) {
// Unterminated `$`
None => {
reports.push(

@ -223,27 +264,35 @@ impl RegexRule for TexRule
.with_message("Unterminated Tex Code")
.with_label(
Label::new((token.source().clone(), token.range.clone()))
.with_message(format!("Missing terminating `{}` after first `{}`",
.with_message(format!(
"Missing terminating `{}` after first `{}`",
["|$", "$"][index].fg(parser.colors().info),
["$|", "$"][index].fg(parser.colors().info)))
.with_color(parser.colors().error))
.finish());
["$|", "$"][index].fg(parser.colors().info)
))
.with_color(parser.colors().error),
)
.finish(),
);
return reports;
}
Some(content) => {
let processed = util::process_escaped('\\', ["|$", "$"][index],
content.as_str().trim_start().trim_end());
let processed = util::process_escaped(
'\\',
["|$", "$"][index],
content.as_str().trim_start().trim_end(),
);

if processed.is_empty()
{
if processed.is_empty() {
reports.push(
Report::build(ReportKind::Warning, token.source(), content.start())
.with_message("Empty Tex Code")
.with_label(
Label::new((token.source().clone(), content.range()))
.with_message("Tex code is empty")
.with_color(parser.colors().warning))
.finish());
.with_color(parser.colors().warning),
)
.finish(),
);
}
processed
}

@ -251,17 +300,26 @@ impl RegexRule for TexRule

// TODO: Caption

parser.push(document, Box::new(Tex::new(
parser.push(
document,
Box::new(Tex::new(
token,
if index == 1 { TexKind::Inline } else { TexKind::Block },
if index == 1 {
TexKind::Inline
} else {
TexKind::Block
},
tex_env.to_string(),
tex_content,
None,
)));
)),
);

reports
}

// TODO
fn lua_bindings<'lua>(&self, _lua: &'lua Lua) -> Vec<(String, Function<'lua>)> { vec![] }
fn lua_bindings<'lua>(&self, _lua: &'lua Lua) -> Vec<(String, Function<'lua>)> {
vec![]
}
}
@ -1,33 +1,44 @@
use mlua::{Function, Lua};
use std::any::Any;
use std::ops::Range;
use std::rc::Rc;

use crate::{compiler::compiler::Compiler, document::{document::Document, element::{ElemKind, Element}}, lua::kernel::CTX, parser::{rule::Rule, source::Token}};
use ariadne::Report;
use mlua::Function;
use mlua::Lua;

use crate::compiler::compiler::Compiler;
use crate::document::document::Document;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::lua::kernel::CTX;
use crate::parser::parser::Parser;
use crate::parser::rule::Rule;
use crate::parser::source::Cursor;
use crate::parser::source::Source;
use crate::parser::source::Token;

#[derive(Debug)]
pub struct Text
{
pub struct Text {
pub(self) location: Token,
pub(self) content: String,
}

impl Text
{
pub fn new(location: Token, content: String) -> Text
{
impl Text {
pub fn new(location: Token, content: String) -> Text {
Text {
location: location,
content: content
content: content,
}
}
}

impl Element for Text
{
impl Element for Text {
fn location(&self) -> &Token { &self.location }
fn kind(&self) -> ElemKind { ElemKind::Inline }
fn element_name(&self) -> &'static str { "Text" }
fn to_string(&self) -> String { format!("{self:#?}") }

fn compile(&self, compiler: &Compiler, _document: &Document) -> Result<String, String> {
fn compile(&self, compiler: &Compiler, _document: &dyn Document) -> Result<String, String> {
Ok(compiler.sanitize(self.content.as_str()))
}
}

@ -35,28 +46,42 @@ impl Element for Text
#[derive(Default)]
pub struct TextRule;

impl Rule for TextRule
{
impl Rule for TextRule {
fn name(&self) -> &'static str { "Text" }

fn next_match(&self, cursor: &crate::parser::source::Cursor) -> Option<(usize, Box<dyn std::any::Any>)> { None }
fn next_match(&self, _cursor: &Cursor) -> Option<(usize, Box<dyn Any>)> { None }

fn on_match(&self, parser: &dyn crate::parser::parser::Parser, document: &crate::document::document::Document, cursor: crate::parser::source::Cursor, match_data: Option<Box<dyn std::any::Any>>) -> (crate::parser::source::Cursor, Vec<ariadne::Report<'_, (std::rc::Rc<dyn crate::parser::source::Source>, std::ops::Range<usize>)>>) { panic!("Text canno match"); }
fn on_match(
&self,
_parser: &dyn Parser,
_document: &dyn Document,
_cursor: Cursor,
_match_data: Option<Box<dyn Any>>,
) -> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>) {
panic!("Text cannot match");
}

fn lua_bindings<'lua>(&self, lua: &'lua Lua) -> Vec<(String, Function<'lua>)> {
let mut bindings = vec![];

bindings.push(("push".to_string(), lua.create_function(
|_, content: String| {
CTX.with_borrow(|ctx| ctx.as_ref().map(|ctx| {
ctx.parser.push(ctx.document, Box::new(Text {
bindings.push((
"push".to_string(),
lua.create_function(|_, content: String| {
CTX.with_borrow(|ctx| {
ctx.as_ref().map(|ctx| {
ctx.parser.push(
ctx.document,
Box::new(Text {
location: ctx.location.clone(),
content,
}));
}));
}),
);
})
});

Ok(())
}).unwrap()));
})
.unwrap(),
));

bindings
}
@ -1,8 +1,8 @@
use mlua::{Function, Lua};
use regex::Regex;
use crate::parser::{parser::{Parser, ReportColors}, rule::RegexRule, source::{Source, Token}};
use crate::{document::document::Document, parser::{parser::{Parser, ReportColors}, rule::RegexRule, source::{Source, Token}}};
use ariadne::{Report, Fmt, Label, ReportKind};
use crate::document::{document::Document, variable::{BaseVariable, PathVariable, Variable}};
use crate::document::variable::{BaseVariable, PathVariable, Variable};
use std::{ops::Range, rc::Rc};

pub struct VariableRule {

@ -91,7 +91,7 @@ impl RegexRule for VariableRule {

fn on_regex_match(&self, _: usize, parser: &dyn Parser, document: &Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>
fn on_regex_match<'a>(&self, _: usize, parser: &dyn Parser, document: &'a dyn Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>
{
let mut result = vec![];
// [Optional] variable kind

@ -223,7 +223,7 @@ impl RegexRule for VariableSubstitutionRule

fn regexes(&self) -> &[regex::Regex] { &self.re }

fn on_regex_match(&self, _index: usize, parser: &dyn Parser, document: &Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
fn on_regex_match<'a>(&self, _index: usize, parser: &dyn Parser, document: &'a dyn Document<'a>, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
let mut result = vec![];

let variable = match matches.get(1)

@ -307,27 +307,13 @@ impl RegexRule for VariableSubstitutionRule
.finish());
return result;
}
Some((_, var)) => var,
Some(var) => var,
}
},
_ => panic!("Unknown error")
};

variable.parse(token, parser, document);
//let parsed = variable.parse(
//	token,
//	parser,
//	document
//);
////document.merge(parsed, None);
//parsed.content.borrow_mut()
//	.drain(..)
//	.for_each(|elem| parser.push(document, elem));
//parser.push(document, )

// TODO: Full rework of document
// parser should parse into previous document, and not into a new document
// This should prevent having to use `recurse: bool` in the last_element getters

return result;
}
@ -1,6 +1,6 @@
use std::rc::Rc;
use std::{cell::{RefCell, RefMut}, collections::HashMap, rc::Rc};

use crate::parser::source::{Cursor, Source};
use crate::{document::{document::Document, element::Element}, lua::kernel::{Kernel, KernelHolder}, parser::{parser::{Parser, ReportColors}, rule::Rule, source::{Cursor, Source}, state::StateHolder}};

#[derive(Debug, Clone)]
pub struct LineCursor

@ -56,26 +56,6 @@ impl LineCursor
//eprintln!("({}, {c:#?}) ({} {})", self.pos, self.line, self.line_pos);
prev = Some(c);
}

/*
self.source.content()
.as_str()[start..pos+1]
.char_indices()
.for_each(|(at, c)| {
self.pos = at+start;

if c == '\n'
{
self.line += 1;
self.line_pos = 0;
}
else
{
self.line_pos += c.len_utf8();
}

});
*/
}
else if pos < self.pos
{

@ -114,3 +94,55 @@ impl From<&LineCursor> for Cursor
}
}
}

#[derive(Debug)]
pub struct LsParser
{
rules: Vec<Box<dyn Rule>>,
colors: ReportColors,

// Parser state
pub state: RefCell<StateHolder>,
pub kernels: RefCell<HashMap<String, Kernel>>,
}

impl Parser for LsParser
{
fn colors(&self) -> &ReportColors { &self.colors }
fn rules(&self) -> &Vec<Box<dyn Rule>> { &self.rules }
fn rules_mut(&mut self) -> &mut Vec<Box<dyn Rule>> { &mut self.rules }

fn state(&self) -> std::cell::Ref<'_, StateHolder> { self.state.borrow() }
fn state_mut(&self) -> std::cell::RefMut<'_, StateHolder> { self.state.borrow_mut() }

fn has_error(&self) -> bool { true }

fn push<'a>(&self, doc: &dyn Document, elem: Box<dyn Element>) {
todo!()
}

fn parse<'a>(&self, source: Rc<dyn Source>, parent: Option<&'a dyn Document<'a>>) -> Box<dyn Document<'a>+'a> {
todo!()
}

fn parse_into<'a>(&self, source: Rc<dyn Source>, document: &'a dyn Document<'a>) {
todo!()
}
}

impl KernelHolder for LsParser
{
fn get_kernel(&self, name: &str)
-> Option<RefMut<'_, Kernel>> {
RefMut::filter_map(self.kernels.borrow_mut(),
|map| map.get_mut(name)).ok()
}

fn insert_kernel(&self, name: String, kernel: Kernel)
-> RefMut<'_, Kernel> {
//TODO do not get
self.kernels.borrow_mut()
.insert(name.clone(), kernel);
self.get_kernel(name.as_str()).unwrap()
}
}
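The `get_kernel` implementation above narrows a `RefMut` over the whole kernel map down to a single entry with `std::cell::RefMut::filter_map`, which returns the original borrow back on failure (hence the `.ok()`). A standalone sketch of the same idiom (the `Kernel` stand-in here is hypothetical):

```rust
use std::cell::{RefCell, RefMut};
use std::collections::HashMap;

// Minimal stand-in for the real `Kernel` type, to keep the sketch self-contained.
struct Kernel(&'static str);

fn get_kernel<'a>(
    kernels: &'a RefCell<HashMap<String, Kernel>>,
    name: &str,
) -> Option<RefMut<'a, Kernel>> {
    // `filter_map` narrows the borrow to one entry if present,
    // otherwise hands the map borrow back as the `Err` variant.
    RefMut::filter_map(kernels.borrow_mut(), |map| map.get_mut(name)).ok()
}

fn main() {
    let kernels = RefCell::new(HashMap::new());
    kernels.borrow_mut().insert("main".to_string(), Kernel("lua"));

    assert_eq!(get_kernel(&kernels, "main").unwrap().0, "lua");
    assert!(get_kernel(&kernels, "missing").is_none());
}
```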
@ -59,7 +59,7 @@ pub fn provide(semantic_tokens: &mut Vec<SemanticToken>, cursor: &mut LineCursor
}
}

pub fn semantic_token_from_document(document: &Document) -> Vec<SemanticToken>
pub fn semantic_token_from_document(document: &dyn Document) -> Vec<SemanticToken>
{
let mut semantic_tokens = vec![];

@ -71,20 +71,36 @@ pub fn semantic_token_from_document(document: &Document) -> Vec<SemanticToken>
source: source.clone()
};

document.content.borrow()
.iter()
.for_each(|elem| {
if let Some(paragraph) = elem.downcast_ref::<Paragraph>()
{
paragraph.content
.iter()
.for_each(|elem| provide(&mut semantic_tokens, &mut cursor, elem));
}
else
{
provide(&mut semantic_tokens, &mut cursor, elem);
}
semantic_tokens.push(SemanticToken {
delta_line: 1,
delta_start: 1,
length: 5,
token_type: 0,
token_modifiers_bitset: 0,
});

semantic_tokens.push(SemanticToken {
delta_line: 1,
delta_start: 1,
length: 5,
token_type: 1,
token_modifiers_bitset: 0,
});

//document.content.borrow()
//	.iter()
//	.for_each(|elem| {
//		if let Some(paragraph) = elem.downcast_ref::<Paragraph>()
//		{
//			paragraph.content
//				.iter()
//				.for_each(|elem| provide(&mut semantic_tokens, &mut cursor, elem));
//		}
//		else
//		{
//			provide(&mut semantic_tokens, &mut cursor, elem);
//		}
//	});

semantic_tokens
}
@ -1,19 +1,19 @@
use std::{cell::{RefCell, RefMut}, rc::Rc};
use std::cell::{RefCell, RefMut};

use mlua::{Error, FromLua, Lua, UserData, UserDataMethods};
use mlua::Lua;

use crate::{document::document::Document, parser::{parser::Parser, source::Token}};

pub struct KernelContext<'a>
pub struct KernelContext<'a, 'b>
{
pub location: Token,
pub parser: &'a dyn Parser,
pub document: &'a Document<'a>,
pub document: &'b dyn Document<'b>,
//pub parser: &'a dyn Parser,
}

thread_local! {
pub static CTX: RefCell<Option<KernelContext<'static>>> = RefCell::new(None);
pub static CTX: RefCell<Option<KernelContext<'static, 'static>>> = RefCell::new(None);
}

#[derive(Debug)]

@ -23,7 +23,6 @@ pub struct Kernel
}

impl Kernel {

// TODO: Take parser as arg and
// iterate over the rules
// to find export the bindings (if some)
72 src/main.rs
@ -1,16 +1,18 @@
#![feature(char_indices_offset)]
mod document;
mod cache;
mod compiler;
mod parser;
mod document;
mod elements;
mod lua;
mod cache;
mod parser;

use std::{env, rc::Rc};
use std::env;
use std::rc::Rc;

use compiler::compiler::Compiler;
use getopts::Options;
use parser::{langparser::LangParser, parser::Parser};
use parser::langparser::LangParser;
use parser::parser::Parser;

use crate::parser::source::SourceFile;
extern crate getopts;

@ -20,9 +22,9 @@ fn print_usage(program: &str, opts: Options) {
print!("{}", opts.usage(&brief));
}

fn print_version()
{
print!("NML -- Not a Markup Language
fn print_version() {
print!(
"NML -- Not a Markup Language
Copyright (c) 2024
NML is licensed under the GNU Affero General Public License version 3 (AGPLv3),
under the terms of the Free Software Foundation <https://www.gnu.org/licenses/agpl-3.0.en.html>.

@ -30,7 +32,8 @@ under the terms of the Free Software Foundation <https://www.gnu.org/licenses/ag
This program is free software; you may modify and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

NML version: 0.4\n");
NML version: 0.4\n"
);
}

fn main() {

@ -45,11 +48,12 @@ fn main() {
opts.optflag("v", "version", "Print program version and licenses");

let matches = match opts.parse(&args[1..]) {
Ok(m) => { m }
Err(f) => { panic!("{}", f.to_string()) }
Ok(m) => m,
Err(f) => {
panic!("{}", f.to_string())
}
};
if matches.opt_present("v")
{
if matches.opt_present("v") {
print_version();
return;
}

@ -72,39 +76,41 @@ fn main() {
let source = SourceFile::new(input.to_string(), None).unwrap();
let doc = parser.parse(Rc::new(source), None);

if debug_opts.contains(&"ast".to_string())
{
if debug_opts.contains(&"ast".to_string()) {
println!("-- BEGIN AST DEBUGGING --");
doc.content.borrow().iter().for_each(|elem| {
println!("{}", (elem).to_string())
});
doc.content()
.borrow()
.iter()
.for_each(|elem| println!("{}", (elem).to_string()));
println!("-- END AST DEBUGGING --");
}

if debug_opts.contains(&"ref".to_string())
{
println!("-- BEGIN REFERENCES DEBUGGING --");
let sc = doc.scope.borrow();
sc.referenceable.iter().for_each(|(name, pos)| {
println!(" - {name}: `{:#?}`", doc.content.borrow()[*pos]);
});
println!("-- END REFERENCES DEBUGGING --");
}
if debug_opts.contains(&"var".to_string())
{
// TODO
//if debug_opts.contains(&"ref".to_string())
//{
//	println!("-- BEGIN REFERENCES DEBUGGING --");
//	let sc = doc.scope.borrow();
//	sc.referenceable.iter().for_each(|(name, pos)| {
//		println!(" - {name}: `{:#?}`", doc.content.borrow()[*pos]);
//	});
//	println!("-- END REFERENCES DEBUGGING --");
//}
if debug_opts.contains(&"var".to_string()) {
println!("-- BEGIN VARIABLES DEBUGGING --");
let sc = doc.scope.borrow();
let sc = doc.scope().borrow();
sc.variables.iter().for_each(|(_name, var)| {
println!(" - `{:#?}`", var);
});
println!("-- END VARIABLES DEBUGGING --");
}

if parser.has_error() {
println!("Compilation aborted due to errors while parsing");
return;
}

let compiler = Compiler::new(compiler::compiler::Target::HTML, db_path);
let out = compiler.compile(&doc);
let out = compiler.compile(doc.as_ref());

std::fs::write("a.html", out).unwrap();
}
@@ -1,15 +1,38 @@
use std::{cell::{RefCell, RefMut}, collections::{HashMap, HashSet}, ops::Range, rc::Rc};
use std::cell::RefCell;
use std::cell::RefMut;
use std::collections::HashMap;
use std::collections::HashSet;
use std::ops::Range;
use std::rc::Rc;

use ariadne::{Label, Report};
use ariadne::Label;
use ariadne::Report;

use crate::{document::{document::Document, element::{ElemKind, Element}}, elements::{paragraph::Paragraph, registrar::register, text::Text}, lua::kernel::{Kernel, KernelHolder}, parser::source::{SourceFile, VirtualSource}};
use crate::document::document::Document;
use crate::document::document::DocumentAccessors;
use crate::document::element::ElemKind;
use crate::document::element::Element;
use crate::document::langdocument::LangDocument;
use crate::elements::paragraph::Paragraph;
use crate::elements::registrar::register;
use crate::elements::text::Text;
use crate::lua::kernel::Kernel;
use crate::lua::kernel::KernelHolder;
use crate::parser::source::SourceFile;
use crate::parser::source::VirtualSource;

use super::{parser::{Parser, ReportColors}, rule::Rule, source::{Cursor, Source, Token}, state::StateHolder, util};
use super::parser::Parser;
use super::parser::ReportColors;
use super::rule::Rule;
use super::source::Cursor;
use super::source::Source;
use super::source::Token;
use super::state::StateHolder;
use super::util;

/// Parser for the language
#[derive(Debug)]
pub struct LangParser
{
pub struct LangParser {
    rules: Vec<Box<dyn Rule>>,
    colors: ReportColors,

@@ -19,10 +42,8 @@ pub struct LangParser
    pub kernels: RefCell<HashMap<String, Kernel>>,
}

impl LangParser
{
    pub fn default() -> Self
    {
impl LangParser {
    pub fn default() -> Self {
        let mut s = Self {
            rules: vec![],
            colors: ReportColors::with_colors(),

@@ -32,24 +53,25 @@ impl LangParser
        };
        register(&mut s);

        s.kernels.borrow_mut()
        s.kernels
            .borrow_mut()
            .insert("main".to_string(), Kernel::new(&s));
        s
    }

    fn handle_reports<'a>(&self, _source: Rc<dyn Source>, reports: Vec<Report<'a, (Rc<dyn Source>, Range<usize>)>>)
    {
        for mut report in reports
        {
    fn handle_reports<'a>(
        &self,
        _source: Rc<dyn Source>,
        reports: Vec<Report<'a, (Rc<dyn Source>, Range<usize>)>>,
    ) {
        for mut report in reports {
            let mut sources: HashSet<Rc<dyn Source>> = HashSet::new();
            fn recurse_source(sources: &mut HashSet<Rc<dyn Source>>, source: Rc<dyn Source>) {
                sources.insert(source.clone());
                match source.location()
                {
                match source.location() {
                    Some(parent) => {
                        let parent_source = parent.source();
                        if sources.get(&parent_source).is_none()
                        {
                        if sources.get(&parent_source).is_none() {
                            recurse_source(sources, parent_source);
                        }
                    }

@@ -61,32 +83,31 @@ impl LangParser
                recurse_source(&mut sources, label.span.0.clone());
            });

            let cache = sources.iter()
            let cache = sources
                .iter()
                .map(|source| (source.clone(), source.content().clone()))
                .collect::<Vec<(Rc<dyn Source>, String)>>();

            cache.iter()
                .for_each(|(source, _)| {
                    if let Some (location) = source.location()
                    {
                        if let Some(_s) = source.downcast_ref::<SourceFile>()
                        {
            cache.iter().for_each(|(source, _)| {
                if let Some(location) = source.location() {
                    if let Some(_s) = source.downcast_ref::<SourceFile>() {
                        report.labels.push(
                            Label::new((location.source(), location.start()+1 .. location.end()))
                            Label::new((location.source(), location.start() + 1..location.end()))
                                .with_message("In file included from here")
                                .with_order(-1)
                                .with_order(-1),
                        );
                    };

                    if let Some(_s) = source.downcast_ref::<VirtualSource>()
                    {
                        let start = location.start() + (location.source().content().as_bytes()[location.start()] == '\n' as u8)
                    if let Some(_s) = source.downcast_ref::<VirtualSource>() {
                        let start = location.start()
                            + (location.source().content().as_bytes()[location.start()]
                                == '\n' as u8)
                            .then_some(1)
                            .unwrap_or(0);
                        report.labels.push(
                            Label::new((location.source(), start .. location.end()))
                            Label::new((location.source(), start..location.end()))
                                .with_message("In evaluation of")
                                .with_order(-1)
                                .with_order(-1),
                        );
                    };
                }

@@ -96,71 +117,59 @@ impl LangParser
    }
}

impl Parser for LangParser
{
    fn colors(&self) -> &ReportColors { &self.colors }

    fn rules(&self) -> &Vec<Box<dyn Rule>> { &self.rules }
    fn add_rule(&mut self, rule: Box<dyn Rule>, after: Option<&'static str>)
    {
        // Error on duplicate rule
        let rule_name = (*rule).name();
        self.rules.iter().for_each(|rule| {
            if (*rule).name() != rule_name { return; }

            panic!("Attempted to introduce duplicate rule: `{rule_name}`");
        });

        match after
        {
            Some(name) => {
                let before = self.rules.iter()
                    .enumerate()
                    .find(|(_pos, r)| (r).name() == name);

                match before
                {
                    Some((pos, _)) => self.rules.insert(pos+1, rule),
                    _ => panic!("Unable to find rule named `{name}`, to insert rule `{}` after it", rule.name())
                }
            }
            _ => self.rules.push(rule)
        }
impl Parser for LangParser {
    fn colors(&self) -> &ReportColors {
        &self.colors
    }

    fn state(&self) -> std::cell::Ref<'_, StateHolder> { self.state.borrow() }
    fn state_mut(&self) -> std::cell::RefMut<'_, StateHolder> { self.state.borrow_mut() }
    fn rules(&self) -> &Vec<Box<dyn Rule>> {
        &self.rules
    }
    fn rules_mut(&mut self) -> &mut Vec<Box<dyn Rule>> {
        &mut self.rules
    }

    fn state(&self) -> std::cell::Ref<'_, StateHolder> {
        self.state.borrow()
    }
    fn state_mut(&self) -> std::cell::RefMut<'_, StateHolder> {
        self.state.borrow_mut()
    }

    fn has_error(&self) -> bool { *self.err_flag.borrow() }

    /// Add an [`Element`] to the [`Document`]
    fn push<'a>(&self, doc: &'a Document<'a>, elem: Box<dyn Element>)
    {
        if elem.kind() == ElemKind::Inline || elem.kind() == ElemKind::Invisible
        {
            let mut paragraph = doc.last_element_mut::<Paragraph>(false)
    fn push<'a>(&self, doc: &dyn Document, elem: Box<dyn Element>) {
        if elem.kind() == ElemKind::Inline || elem.kind() == ElemKind::Invisible {
            let mut paragraph = doc
                .last_element_mut::<Paragraph>()
                .or_else(|| {
                    doc.push(Box::new(Paragraph::new(elem.location().clone())));
                    doc.last_element_mut::<Paragraph>(false)
                }).unwrap();
                    doc.last_element_mut::<Paragraph>()
                })
                .unwrap();

            paragraph.push(elem);
        }
        else
        {
        } else {
            // Process paragraph events
            if doc.last_element_mut::<Paragraph>(false)
                .is_some_and(|_| true)
            {
                self.handle_reports(doc.source(),
                    self.state_mut().on_scope_end(self, &doc, super::state::Scope::PARAGRAPH));
            if doc.last_element::<Paragraph>().is_some_and(|_| true) {
                self.handle_reports(
                    doc.source(),
                    self.state_mut()
                        .on_scope_end(self, doc, super::state::Scope::PARAGRAPH),
                );
            }

            doc.push(elem);
        }
    }

    fn parse<'a>(&self, source: Rc<dyn Source>, parent: Option<&'a Document<'a>>) -> Document<'a>
    {
        let doc = Document::new(source.clone(), parent);
    fn parse<'a>(
        &self,
        source: Rc<dyn Source>,
        parent: Option<&'a dyn Document<'a>>,
    ) -> Box<dyn Document<'a> + 'a> {
        let doc = LangDocument::new(source.clone(), parent);
        let mut matches = Vec::new();
        for _ in 0..self.rules.len() {
            matches.push((0usize, None));

@@ -169,52 +178,59 @@ impl Parser for LangParser
        let content = source.content();
        let mut cursor = Cursor::new(0usize, doc.source()); // Cursor in file

        if parent.is_some() // Terminate parent's paragraph state
        if let Some(parent) = parent
        // Terminate parent's paragraph state
        {
            self.handle_reports(parent.as_ref().unwrap().source(),
                self.state_mut().on_scope_end(self, parent.as_ref().unwrap(), super::state::Scope::PARAGRAPH));
            self.handle_reports(
                parent.source(),
                self.state_mut()
                    .on_scope_end(self, parent, super::state::Scope::PARAGRAPH),
            );
        }

        loop
        {
        loop {
            let (rule_pos, rule, match_data) = self.update_matches(&cursor, &mut matches);

            // Unmatched content
            let text_content = util::process_text(&doc, &content.as_str()[cursor.pos..rule_pos.pos]);
            if !text_content.is_empty()
            {
                self.push(&doc, Box::new(Text::new(
            let text_content =
                util::process_text(&doc, &content.as_str()[cursor.pos..rule_pos.pos]);
            if !text_content.is_empty() {
                self.push(
                    &doc,
                    Box::new(Text::new(
                        Token::new(cursor.pos..rule_pos.pos, source.clone()),
                        text_content
                )));
                        text_content,
                    )),
                );
            }

            if let Some(rule) = rule
            {

            if let Some(rule) = rule {
                // Rule callback
                let (new_cursor, reports) = (*rule).on_match(self, &doc, rule_pos, match_data);
                let dd: &'a dyn Document = unsafe { std::mem::transmute(&doc as &dyn Document) };
                let (new_cursor, reports) = rule.on_match(self, dd, rule_pos, match_data);

                self.handle_reports(doc.source(), reports);

                // Advance
                cursor = new_cursor;
            }
            else // No rules left
            } else
            // No rules left
            {
                break;
            }
        }

        // State
        self.handle_reports(doc.source(),
            self.state_mut().on_scope_end(self, &doc, super::state::Scope::DOCUMENT));
        self.handle_reports(
            doc.source(),
            self.state_mut()
                .on_scope_end(self, &doc, super::state::Scope::DOCUMENT),
        );

        return doc;
        return Box::new(doc);
    }

    fn parse_into<'a>(&self, source: Rc<dyn Source>, document: &'a Document<'a>)
    {
    fn parse_into<'a>(&self, source: Rc<dyn Source>, document: &'a dyn Document<'a>) {
        let mut matches = Vec::new();
        for _ in 0..self.rules.len() {
            matches.push((0usize, None));

@@ -223,31 +239,32 @@ impl Parser for LangParser
        let content = source.content();
        let mut cursor = Cursor::new(0usize, source.clone());

        loop
        {
        loop {
            let (rule_pos, rule, match_data) = self.update_matches(&cursor, &mut matches);

            // Unmatched content
            let text_content = util::process_text(&document, &content.as_str()[cursor.pos..rule_pos.pos]);
            if !text_content.is_empty()
            {
                self.push(&document, Box::new(Text::new(
            let text_content =
                util::process_text(document, &content.as_str()[cursor.pos..rule_pos.pos]);
            if !text_content.is_empty() {
                self.push(
                    document,
                    Box::new(Text::new(
                        Token::new(cursor.pos..rule_pos.pos, source.clone()),
                        text_content
                )));
                        text_content,
                    )),
                );
            }

            if let Some(rule) = rule
            {
            if let Some(rule) = rule {
                // Rule callback
                let (new_cursor, reports) = (*rule).on_match(self, &document, rule_pos, match_data);
                let (new_cursor, reports) = (*rule).on_match(self, document, rule_pos, match_data);

                self.handle_reports(document.source(), reports);

                // Advance
                cursor = new_cursor;
            }
            else // No rules left
            } else
            // No rules left
            {
                break;
            }

@@ -261,19 +278,14 @@ impl Parser for LangParser
    }
}

impl KernelHolder for LangParser
{
    fn get_kernel(&self, name: &str)
        -> Option<RefMut<'_, Kernel>> {
        RefMut::filter_map(self.kernels.borrow_mut(),
            |map| map.get_mut(name)).ok()
impl KernelHolder for LangParser {
    fn get_kernel(&self, name: &str) -> Option<RefMut<'_, Kernel>> {
        RefMut::filter_map(self.kernels.borrow_mut(), |map| map.get_mut(name)).ok()
    }

    fn insert_kernel(&self, name: String, kernel: Kernel)
        -> RefMut<'_, Kernel> {
    fn insert_kernel(&self, name: String, kernel: Kernel) -> RefMut<'_, Kernel> {
        //TODO do not get
        self.kernels.borrow_mut()
            .insert(name.clone(), kernel);
        self.kernels.borrow_mut().insert(name.clone(), kernel);
        self.get_kernel(name.as_str()).unwrap()
    }
}
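A central change in this file is that `parse` now returns `Box<dyn Document<'a> + 'a>` instead of the concrete `Document<'a>`, with `LangDocument` as the concrete implementation. A minimal sketch of that trait-object pattern, with toy stand-ins rather than the crate's real API:

```rust
// Toy stand-ins for the Document trait-object pattern used in the diff.
trait Document {
    fn name(&self) -> String;
}

struct LangDocument {
    source: String,
}

impl Document for LangDocument {
    fn name(&self) -> String {
        format!("doc({})", self.source)
    }
}

// Returning a boxed trait object keeps callers independent of the concrete
// document type, at the cost of dynamic dispatch.
fn parse(source: &str) -> Box<dyn Document> {
    Box::new(LangDocument { source: source.to_string() })
}

fn main() {
    let doc = parse("main.nml");
    assert_eq!(doc.name(), "doc(main.nml)");
}
```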
@@ -1,6 +1,6 @@
pub mod source;
pub mod parser;
pub mod langparser;
pub mod parser;
pub mod rule;
pub mod source;
pub mod state;
pub mod util;
@@ -1,19 +1,20 @@
use std::any::Any;
use std::cell::{Ref, RefMut};
use std::cell::Ref;
use std::cell::RefMut;
use std::rc::Rc;
use unicode_segmentation::UnicodeSegmentation;

use super::rule::Rule;
use super::source::{Cursor, Source};
use super::source::Cursor;
use super::source::Source;
use super::state::StateHolder;
use crate::document::document::Document;
use crate::document::element::Element;
use ariadne::Color;
use crate::lua::kernel::KernelHolder;
use ariadne::Color;

#[derive(Debug)]
pub struct ReportColors
{
pub struct ReportColors {
    pub error: Color,
    pub warning: Color,
    pub info: Color,

@@ -40,56 +41,102 @@ impl ReportColors {
    }
}

pub trait Parser: KernelHolder
{
pub trait Parser: KernelHolder {
    /// Gets the colors for formatting errors
    ///
    /// When colors are disabled, all colors should resolve to empty string
    fn colors(&self) -> &ReportColors;

    fn rules(&self) -> &Vec<Box<dyn Rule>>;
    fn add_rule(&mut self, rule: Box<dyn Rule>, after: Option<&'static str>);
    fn rules_mut(&mut self) -> &mut Vec<Box<dyn Rule>>;

    fn add_rule(&mut self, rule: Box<dyn Rule>, after: Option<&'static str>) -> Result<(), String> {
        // Error on duplicate rule
        let rule_name = (*rule).name();
        if let Err(e) = self.rules().iter().try_for_each(|rule| {
            if (*rule).name() != rule_name {
                return Ok(());
            }

            return Err(format!(
                "Attempted to introduce duplicate rule: `{rule_name}`"
            ));
        }) {
            return Err(e);
        }

        match after {
            Some(name) => {
                let before = self
                    .rules()
                    .iter()
                    .enumerate()
                    .find(|(_pos, r)| (r).name() == name);

                match before {
                    Some((pos, _)) => self.rules_mut().insert(pos + 1, rule),
                    _ => {
                        return Err(format!(
                            "Unable to find rule named `{name}`, to insert rule `{}` after it",
                            rule.name()
                        ))
                    }
                }
            }
            _ => self.rules_mut().push(rule),
        }

        Ok(())
    }

    fn state(&self) -> Ref<'_, StateHolder>;
    fn state_mut(&self) -> RefMut<'_, StateHolder>;

    fn has_error(&self) -> bool;

    // Update [`matches`] and returns the position of the next matched rule.
    // If rule is empty, it means that there are no rules left to parse (i.e
    // end of document).
    fn update_matches(&self, cursor: &Cursor, matches: &mut Vec<(usize, Option<Box<dyn Any>>)>)
        -> (Cursor, Option<&Box<dyn Rule>>, Option<Box<dyn Any>>)
    {
    fn update_matches(
        &self,
        cursor: &Cursor,
        matches: &mut Vec<(usize, Option<Box<dyn Any>>)>,
    ) -> (Cursor, Option<&Box<dyn Rule>>, Option<Box<dyn Any>>) {
        // Update matches
        // TODO: Trivially parellalizable
        self.rules().iter().zip(matches.iter_mut()).for_each(
            |(rule, (matched_at, match_data))| {
        self.rules()
            .iter()
            .zip(matches.iter_mut())
            .for_each(|(rule, (matched_at, match_data))| {
                // Don't upate if not stepped over yet
                if *matched_at > cursor.pos { return }
                if *matched_at > cursor.pos {
                    return;
                }

                (*matched_at, *match_data) = match rule.next_match(cursor)
                {
                (*matched_at, *match_data) = match rule.next_match(cursor) {
                    None => (usize::MAX, None),
                    Some((mut pos, mut data)) =>
                    {
                    Some((mut pos, mut data)) => {
                        // Check if escaped
                        while pos != usize::MAX
                        {
                        while pos != usize::MAX {
                            let content = cursor.source.content().as_str();
                            let mut graphemes = content[0 .. pos].graphemes(true);
                            let mut graphemes = content[0..pos].graphemes(true);
                            let mut escaped = false;
                            'inner: loop
                            {
                            'inner: loop {
                                let g = graphemes.next_back();
                                if !g.is_some() || g.unwrap() != "\\" { break 'inner; }
                                if !g.is_some() || g.unwrap() != "\\" {
                                    break 'inner;
                                }

                                escaped = !escaped;
                            }
                            if !escaped { break; }
                            if !escaped {
                                break;
                            }

                            // Find next potential match
                            (pos, data) = match rule.next_match(&cursor.at(pos+1)) {
                            (pos, data) = match rule.next_match(&cursor.at(pos + 1)) {
                                Some((new_pos, new_data)) => (new_pos, new_data),
                                None => (usize::MAX, data) // Stop iterating
                                None => (usize::MAX, data), // Stop iterating
                            }
                        }

@@ -99,27 +146,36 @@ pub trait Parser: KernelHolder
            });

        // Get winning match
        let (winner, (next_pos, _match_data)) = matches.iter()
        let (winner, (next_pos, _match_data)) = matches
            .iter()
            .enumerate()
            .min_by_key(|(_, (pos, _match_data))| pos).unwrap();
        if *next_pos == usize::MAX // No rule has matched
            .min_by_key(|(_, (pos, _match_data))| pos)
            .unwrap();
        if *next_pos == usize::MAX
        // No rule has matched
        {
            let content = cursor.source.content();
            // No winners, i.e no matches left
            return (cursor.at(content.len()), None, None);
        }

        (cursor.at(*next_pos),
        (
            cursor.at(*next_pos),
            Some(&self.rules()[winner]),
            std::mem::replace(&mut matches[winner].1, None))
            std::mem::replace(&mut matches[winner].1, None),
        )
    }

    /// Add an [`Element`] to the [`Document`]
    fn push<'a>(&self, doc: &'a Document<'a>, elem: Box<dyn Element>);
    fn push<'a>(&self, doc: &dyn Document, elem: Box<dyn Element>);

    /// Parse [`Source`] into a new [`Document`]
    fn parse<'a>(&self, source: Rc<dyn Source>, parent: Option<&'a Document<'a>>) -> Document<'a>;
    fn parse<'a>(
        &self,
        source: Rc<dyn Source>,
        parent: Option<&'a dyn Document<'a>>,
    ) -> Box<dyn Document<'a> + 'a>;

    /// Parse [`Source`] into an already existing [`Document`]
    fn parse_into<'a>(&self, source: Rc<dyn Source>, document: &'a Document<'a>);
    fn parse_into<'a>(&self, source: Rc<dyn Source>, document: &'a dyn Document<'a>);
}
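The new default `add_rule` above replaces the old panicking implementation with a `Result`, and still supports inserting a rule immediately after a named one. A simplified model of the same logic, with plain strings standing in for boxed `Rule` trait objects:

```rust
// Simplified model of the diff's `add_rule`: reject duplicate names and
// optionally insert the new rule right after a named existing rule.
fn add_rule(rules: &mut Vec<String>, rule: String, after: Option<&str>) -> Result<(), String> {
    if rules.iter().any(|r| *r == rule) {
        return Err(format!("Attempted to introduce duplicate rule: `{rule}`"));
    }
    match after {
        Some(name) => match rules.iter().position(|r| r.as_str() == name) {
            Some(pos) => rules.insert(pos + 1, rule),
            None => return Err(format!("Unable to find rule named `{name}`")),
        },
        None => rules.push(rule),
    }
    Ok(())
}

fn main() {
    let mut rules = vec!["comment".to_string(), "text".to_string()];
    add_rule(&mut rules, "bold".to_string(), Some("comment")).unwrap();
    assert_eq!(rules, vec!["comment", "bold", "text"]);
    // Duplicates are now reported as Err instead of panicking.
    assert!(add_rule(&mut rules, "bold".to_string(), None).is_err());
}
```

Returning `Result` lets callers such as element registrars surface rule-ordering mistakes without aborting the whole process.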
@@ -1,8 +1,11 @@
use super::parser::Parser;
use super::source::{Cursor, Source, Token};
use ariadne::Report;
use mlua::{Function, Lua};
use super::source::Cursor;
use super::source::Source;
use super::source::Token;
use crate::document::document::Document;
use ariadne::Report;
use mlua::Function;
use mlua::Lua;

use std::any::Any;
use std::ops::Range;

@@ -14,13 +17,18 @@ pub trait Rule {
    /// Finds the next match starting from [`cursor`]
    fn next_match(&self, cursor: &Cursor) -> Option<(usize, Box<dyn Any>)>;
    /// Callback when rule matches
    fn on_match(&self, parser: &dyn Parser, document: &Document, cursor: Cursor, match_data: Option<Box<dyn Any>>) -> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>);
    fn on_match<'a>(
        &self,
        parser: &dyn Parser,
        document: &'a (dyn Document<'a> + 'a),
        cursor: Cursor,
        match_data: Option<Box<dyn Any>>,
    ) -> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>);
    /// Export bindings to lua
    fn lua_bindings<'lua>(&self, _lua: &'lua Lua) -> Vec<(String, Function<'lua>)>;
}

impl core::fmt::Debug for dyn Rule
{
impl core::fmt::Debug for dyn Rule {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "Rule{{{}}}", self.name())
    }

@@ -65,53 +73,78 @@ impl<T: RegexRule> Rule for T {
}
*/

pub trait RegexRule
{
pub trait RegexRule {
    fn name(&self) -> &'static str;

    /// Returns the rule's regexes
    fn regexes(&self) -> &[regex::Regex];

    /// Callback on regex rule match
    fn on_regex_match(&self, index: usize, parser: &dyn Parser, document: &Document, token: Token, matches: regex::Captures) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>;
    fn on_regex_match<'a>(
        &self,
        index: usize,
        parser: &dyn Parser,
        document: &'a (dyn Document<'a> + 'a),
        token: Token,
        matches: regex::Captures,
    ) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>;

    fn lua_bindings<'lua>(&self, _lua: &'lua Lua) -> Vec<(String, Function<'lua>)>;
}

impl<T: RegexRule> Rule for T {
    fn name(&self) -> &'static str { RegexRule::name(self) }
    fn name(&self) -> &'static str {
        RegexRule::name(self)
    }

    /// Finds the next match starting from [`cursor`]
    fn next_match(&self, cursor: &Cursor)
        -> Option<(usize, Box<dyn Any>)> {
    fn next_match(&self, cursor: &Cursor) -> Option<(usize, Box<dyn Any>)> {
        let content = cursor.source.content();
        let mut found: Option<(usize, usize)> = None;
        self.regexes().iter().enumerate().for_each(|(id, re)| {
            if let Some(m) = re.find_at(content.as_str(), cursor.pos)
            {
            if let Some(m) = re.find_at(content.as_str(), cursor.pos) {
                found = found
                    .and_then(|(f_pos, f_id)|
                        if f_pos > m.start() { Some((m.start(), id)) } else { Some((f_pos, f_id)) } )
                    .and_then(|(f_pos, f_id)| {
                        if f_pos > m.start() {
                            Some((m.start(), id))
                        } else {
                            Some((f_pos, f_id))
                        }
                    })
                    .or(Some((m.start(), id)));
            }
        });

        return found.map(|(pos, id)|
            (pos, Box::new(id) as Box<dyn Any>));
        return found.map(|(pos, id)| (pos, Box::new(id) as Box<dyn Any>));
    }

    fn on_match(&self, parser: &dyn Parser, document: &Document, cursor: Cursor, match_data: Option<Box<dyn Any>>)
        -> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>) {
    fn on_match<'a>(
        &self,
        parser: &dyn Parser,
        document: &'a (dyn Document<'a> + 'a),
        cursor: Cursor,
        match_data: Option<Box<dyn Any>>,
    ) -> (Cursor, Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>) {
        let content = cursor.source.content();
        let index = unsafe { match_data.unwrap_unchecked().downcast::<usize>().unwrap_unchecked() };
        let index = unsafe {
            match_data
                .unwrap_unchecked()
                .downcast::<usize>()
                .unwrap_unchecked()
        };
        let re = &self.regexes()[*index];

        let captures = re.captures_at(content.as_str(), cursor.pos).unwrap();
        let token = Token::new(captures.get(0).unwrap().range(), cursor.source.clone());

        let token_end = token.end();
        return (cursor.at(token_end), self.on_regex_match(*index, parser, document, token, captures));
        return (
            cursor.at(token_end),
            self.on_regex_match(*index, parser, document, token, captures),
        );
    }

    fn lua_bindings<'lua>(&self, lua: &'lua Lua) -> Vec<(String, Function<'lua>)> { self.lua_bindings(lua) }
    fn lua_bindings<'lua>(&self, lua: &'lua Lua) -> Vec<(String, Function<'lua>)> {
        self.lua_bindings(lua)
    }
}
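`RegexRule::next_match` above scans every regex from the cursor position and keeps the earliest start together with the index of the winning pattern. A simplified sketch of that selection, using plain substring patterns instead of `regex::Regex` so it stays dependency-free:

```rust
// Sketch of the earliest-match selection in RegexRule::next_match, with
// plain substring patterns standing in for the crate's regex set.
// Returns (byte position of the match, index of the winning pattern).
fn next_match(patterns: &[&str], content: &str, pos: usize) -> Option<(usize, usize)> {
    let mut found: Option<(usize, usize)> = None;
    for (id, pat) in patterns.iter().enumerate() {
        if let Some(off) = content[pos..].find(pat) {
            let start = pos + off;
            // Keep whichever pattern matches earliest in the source.
            found = match found {
                Some((f_pos, f_id)) if f_pos <= start => Some((f_pos, f_id)),
                _ => Some((start, id)),
            };
        }
    }
    found
}

fn main() {
    let patterns = ["**", "__"];
    // `__` starts at byte 2, `**` at byte 6: the earliest one wins.
    assert_eq!(next_match(&patterns, "ab__cd**ef", 0), Some((2, 1)));
    // Restarting the scan past the first match finds the next rule.
    assert_eq!(next_match(&patterns, "ab__cd**ef", 3), Some((6, 0)));
}
```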
@@ -1,12 +1,13 @@
use std::{fs, ops::Range, rc::Rc};
use core::fmt::Debug;
use std::fs;
use std::ops::Range;
use std::rc::Rc;

use downcast_rs::{impl_downcast, Downcast};
use serde::{Deserialize, Serialize};
use downcast_rs::impl_downcast;
use downcast_rs::Downcast;

/// Trait for source content
pub trait Source: Downcast
{
pub trait Source: Downcast {
    /// Gets the source's location
    fn location(&self) -> Option<&Token>;
    /// Gets the source's name

@@ -16,23 +17,20 @@ pub trait Source: Downcast
}
impl_downcast!(Source);

impl core::fmt::Display for dyn Source
{
impl core::fmt::Display for dyn Source {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "{}", self.name())
    }
}

impl core::fmt::Debug for dyn Source
{
impl core::fmt::Debug for dyn Source {
    // TODO
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "Source{{{}}}", self.name())
    }
}

impl std::cmp::PartialEq for dyn Source
{
impl std::cmp::PartialEq for dyn Source {
    fn eq(&self, other: &Self) -> bool {
        self.name() == other.name()
    }

@@ -40,30 +38,29 @@ impl std::cmp::PartialEq for dyn Source

impl std::cmp::Eq for dyn Source {}

impl std::hash::Hash for dyn Source
{
impl std::hash::Hash for dyn Source {
    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
        self.name().hash(state)
    }
}

pub struct SourceFile
{
pub struct SourceFile {
    location: Option<Token>,
    path: String,
    content: String,
}


impl SourceFile
{
impl SourceFile {
    // TODO: Create a SourceFileRegistry holding already loaded files to avoid reloading them
    pub fn new(path: String, location: Option<Token>) -> Result<Self, String>
    {
        match fs::read_to_string(&path)
        {
            Err(_) => return Err(String::from(format!("Unable to read file content: `{}`", path))),
            Ok(content) => Ok(Self{
    pub fn new(path: String, location: Option<Token>) -> Result<Self, String> {
        match fs::read_to_string(&path) {
            Err(_) => {
                return Err(String::from(format!(
                    "Unable to read file content: `{}`",
                    path
                )))
            }
            Ok(content) => Ok(Self {
                location,
                path,
                content,

@@ -71,8 +68,7 @@ impl SourceFile
        }
    }

    pub fn with_content(path: String, content: String, location: Option<Token>) -> Self
    {
    pub fn with_content(path: String, content: String, location: Option<Token>) -> Self {
        Self {
            location: location,
            path: path,

@@ -81,38 +77,48 @@ impl SourceFile
    }
}

impl Source for SourceFile
{
    fn location(&self) -> Option<&Token> { self.location.as_ref() }
    fn name(&self) -> &String { &self.path }
    fn content(&self) -> &String { &self.content }
impl Source for SourceFile {
    fn location(&self) -> Option<&Token> {
        self.location.as_ref()
    }
    fn name(&self) -> &String {
        &self.path
    }
    fn content(&self) -> &String {
        &self.content
    }
}

pub struct VirtualSource
{
pub struct VirtualSource {
    location: Token,
    name: String,
    content: String,
}

impl VirtualSource
{
    pub fn new(location: Token, name: String, content: String) -> Self
    {
        Self { location, name, content }
impl VirtualSource {
    pub fn new(location: Token, name: String, content: String) -> Self {
        Self {
            location,
            name,
            content,
        }
    }
}

impl Source for VirtualSource
{
    fn location(&self) -> Option<&Token> { Some(&self.location) }
    fn name(&self) -> &String { &self.name }
    fn content(&self) -> &String { &self.content }
impl Source for VirtualSource {
    fn location(&self) -> Option<&Token> {
        Some(&self.location)
    }
    fn name(&self) -> &String {
        &self.name
    }
    fn content(&self) -> &String {
        &self.content
    }
}

#[derive(Debug)]
pub struct Cursor
{
pub struct Cursor {
    pub pos: usize,
    pub source: Rc<dyn Source>,
}

@@ -123,8 +129,7 @@ impl Cursor {
    }

    /// Creates [`cursor`] at [`new_pos`] in the same [`file`]
    pub fn at(&self, new_pos: usize) -> Self
    {
    pub fn at(&self, new_pos: usize) -> Self {
        Self {
            pos: new_pos,
            source: self.source.clone(),

@@ -132,8 +137,7 @@ impl Cursor {
    }
}

impl Clone for Cursor
{
impl Clone for Cursor {
    fn clone(&self) -> Self {
        Self {
            pos: self.pos,

@@ -147,41 +151,35 @@ impl Clone for Cursor
}

#[derive(Debug, Clone)]
pub struct Token
{
pub struct Token {
    pub range: Range<usize>,
    source: Rc<dyn Source>,
}

impl Token
{
impl Token {
    pub fn new(range: Range<usize>, source: Rc<dyn Source>) -> Self {
        Self { range, source }
    }

    pub fn source(&self) -> Rc<dyn Source>
    {
        return self.source.clone()
    pub fn source(&self) -> Rc<dyn Source> {
        return self.source.clone();
    }

    /// Construct Token from a range
    pub fn from(start: &Cursor, end: &Cursor) -> Self
    {
    pub fn from(start: &Cursor, end: &Cursor) -> Self {
        assert!(Rc::ptr_eq(&start.source, &end.source));

        Self {
            range: start.pos .. end.pos,
            source: start.source.clone()
            range: start.pos..end.pos,
            source: start.source.clone(),
        }
    }

    pub fn start(&self) -> usize
    {
    pub fn start(&self) -> usize {
        return self.range.start;
    }

    pub fn end(&self) -> usize
    {
    pub fn end(&self) -> usize {
        return self.range.end;
    }
}
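The `Cursor`/`Token` pair above carries a position, a byte range, and a shared handle to the source, with `Token::from` asserting that both cursors point into the same source. A minimal self-contained sketch of that relationship (a plain `Rc<String>` stands in for the `Rc<dyn Source>` handle):

```rust
use std::ops::Range;
use std::rc::Rc;

// Position in a shared source buffer.
#[derive(Clone)]
struct Cursor {
    pos: usize,
    source: Rc<String>,
}

impl Cursor {
    // New cursor at `new_pos` over the same source.
    fn at(&self, new_pos: usize) -> Self {
        Self { pos: new_pos, source: self.source.clone() }
    }
}

// Byte range over a shared source buffer.
struct Token {
    range: Range<usize>,
    source: Rc<String>,
}

impl Token {
    /// Build a token spanning two cursors over the same source.
    fn from(start: &Cursor, end: &Cursor) -> Self {
        assert!(Rc::ptr_eq(&start.source, &end.source));
        Self { range: start.pos..end.pos, source: start.source.clone() }
    }
}

fn main() {
    let start = Cursor { pos: 2, source: Rc::new(String::from("ab__cd")) };
    let tok = Token::from(&start, &start.at(4));
    assert_eq!(tok.range, 2..4);
    assert_eq!(&tok.source[tok.range.clone()], "__");
}
```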
@@ -1,16 +1,20 @@
-use std::{cell::RefCell, collections::HashMap, ops::Range, rc::Rc};
+use std::cell::RefCell;
+use std::collections::HashMap;
+use std::ops::Range;
+use std::rc::Rc;

 use ariadne::Report;
-use downcast_rs::{impl_downcast, Downcast};
+use downcast_rs::impl_downcast;
+use downcast_rs::Downcast;

 use crate::document::document::Document;

-use super::{parser::Parser, source::Source};
+use super::parser::Parser;
+use super::source::Source;

 /// Scope for state objects
 #[derive(PartialEq, PartialOrd, Debug)]
-pub enum Scope
-{
+pub enum Scope {
     /// Global state
     GLOBAL = 0,
     /// Document-local state
@@ -21,18 +25,20 @@ pub enum Scope
     PARAGRAPH = 2,
 }

-pub trait State: Downcast
-{
+pub trait State: Downcast {
     /// Returns the state's [`Scope`]
     fn scope(&self) -> Scope;

     /// Callback called when state goes out of scope
-    fn on_remove<'a>(&self, parser: &dyn Parser, document: &Document) -> Vec<Report<'a, (Rc<dyn Source>, Range<usize>)>>;
+    fn on_remove<'a>(
+        &self,
+        parser: &dyn Parser,
+        document: &dyn Document,
+    ) -> Vec<Report<'a, (Rc<dyn Source>, Range<usize>)>>;
 }
 impl_downcast!(State);

-impl core::fmt::Debug for dyn State
-{
+impl core::fmt::Debug for dyn State {
     fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
         write!(f, "State{{Scope: {:#?}}}", self.scope())
     }
@@ -40,13 +46,11 @@ impl core::fmt::Debug for dyn State

 /// Object owning all the states
 #[derive(Debug)]
-pub struct StateHolder
-{
-    data: HashMap<String, Rc<RefCell<dyn State>>>
+pub struct StateHolder {
+    data: HashMap<String, Rc<RefCell<dyn State>>>,
 }

-impl StateHolder
-{
+impl StateHolder {
     pub fn new() -> Self {
         Self {
             data: HashMap::new(),
@@ -54,38 +58,38 @@ impl StateHolder
     }

     // Attempts to push [`state`]. On collision, returns an error with the already present state
-    pub fn insert(&mut self, name: String, state: Rc<RefCell<dyn State>>) -> Result<Rc<RefCell<dyn State>>, Rc<RefCell<dyn State>>>
-    {
-        match self.data.insert(name, state.clone())
-        {
+    pub fn insert(
+        &mut self,
+        name: String,
+        state: Rc<RefCell<dyn State>>,
+    ) -> Result<Rc<RefCell<dyn State>>, Rc<RefCell<dyn State>>> {
+        match self.data.insert(name, state.clone()) {
             Some(state) => Err(state),
-            _ => Ok(state)
+            _ => Ok(state),
         }
     }

-    pub fn query(&self, name: &String) -> Option<Rc<RefCell<dyn State>>>
-    {
-        self.data
-            .get(name)
-            .map_or(None, |st| Some(st.clone()))
+    pub fn query(&self, name: &String) -> Option<Rc<RefCell<dyn State>>> {
+        self.data.get(name).map_or(None, |st| Some(st.clone()))
     }

-    pub fn on_scope_end(&mut self, parser: &dyn Parser, document: &Document, scope: Scope) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>>
-    {
+    pub fn on_scope_end(
+        &mut self,
+        parser: &dyn Parser,
+        document: &dyn Document,
+        scope: Scope,
+    ) -> Vec<Report<'_, (Rc<dyn Source>, Range<usize>)>> {
         let mut result = vec![];

-        self.data
-            .retain(|_name, state|
-            {
-                if state.borrow().scope() >= scope
-                {
-                    state.borrow().on_remove(parser, document)
+        self.data.retain(|_name, state| {
+            if state.borrow().scope() >= scope {
+                state
+                    .borrow()
+                    .on_remove(parser, document)
                     .drain(..)
                     .for_each(|report| result.push(report));
                 false
-            }
-            else
-            {
+            } else {
                 true
             }
         });
@@ -1,38 +1,43 @@
 use std::collections::HashMap;
 use std::rc::Rc;

 use unicode_segmentation::UnicodeSegmentation;

-use crate::{document::{document::Document, element::ElemKind}, elements::paragraph::Paragraph};
+use crate::document::document::Document;
+use crate::document::document::DocumentAccessors;
+use crate::document::element::ElemKind;
+use crate::elements::paragraph::Paragraph;

 use super::parser::Parser;
 use super::source::Source;

 /// Processes text for escape characters and paragraphing
-pub fn process_text(document: &Document, content: &str) -> String
-{
+pub fn process_text(document: &dyn Document, content: &str) -> String {
     let mut escaped = false;
     let mut newlines = 0usize; // Consecutive newlines
     //println!("Processing: [{content}]");
     let processed = content
-        .grapheme_indices(true)
-        .fold((String::new(), None),
-        |(mut out, prev), (_pos, g)| {
-            if newlines != 0 && g != "\n"
-            {
+        .graphemes(true)
+        .fold((String::new(), None), |(mut out, prev), g| {
+            if newlines != 0 && g != "\n" {
                 newlines = 0;

                 // Add a whitespace if necessary
-                match out.chars().last()
-                {
+                match out.chars().last() {
                     Some(c) => {
                         // NOTE: \n is considered whitespace, so previous codepoint can be \n
                         // (Which can only be done by escaping it)
-                        if !c.is_whitespace() || c == '\n'
-                        {
+                        if !c.is_whitespace() || c == '\n' {
                             out += " ";
                         }
                     }
                     None => {
-                        if document.last_element::<Paragraph>(false)
-                            .and_then(|par| par.find_back(|e| e.kind() != ElemKind::Invisible)
-                                .and_then(|e| Some(e.kind() == ElemKind::Inline)))
+                        if document
+                            .last_element::<Paragraph>()
+                            .and_then(|par| {
+                                par.find_back(|e| e.kind() != ElemKind::Invisible)
+                                    .and_then(|e| Some(e.kind() == ElemKind::Inline))
+                            })
                             .unwrap_or(false)
                         {
                             out += " ";
@@ -42,48 +47,42 @@ pub fn process_text(document: &Document, content: &str) -> String
             }

             // Output grapheme literally when escaped
-            if escaped
-            {
+            if escaped {
                 escaped = false;
                 return (out + g, Some(g));
             }
             // Increment newlines counter
-            else if g == "\n"
-            {
+            else if g == "\n" {
                 newlines += 1;
                 return (out, Some(g));
             }
             // Determine if escaped
-            else if g == "\\"
-            {
+            else if g == "\\" {
                 escaped = !escaped;
                 return (out, Some(g));
             }
             // Whitespaces
-            else if g.chars().count() == 1 && g.chars().last().unwrap().is_whitespace()
-            {
+            else if g.chars().count() == 1 && g.chars().last().unwrap().is_whitespace() {
                 // Content begins with whitespace
-                if prev.is_none()
-                {
-                    if document.last_element::<Paragraph>(false).is_some()
-                    {
-                        return (out+g, Some(g));
-                    }
-                    else
-                    {
+                if prev.is_none() {
+                    if document.last_element::<Paragraph>().is_some() {
+                        return (out + g, Some(g));
+                    } else {
                         return (out, Some(g));
                     }
                 }
                 // Consecutive whitespaces are converted to a single whitespace
-                else if prev.unwrap().chars().count() == 1 &&
-                    prev.unwrap().chars().last().unwrap().is_whitespace()
+                else if prev.unwrap().chars().count() == 1
+                    && prev.unwrap().chars().last().unwrap().is_whitespace()
                 {
                     return (out, Some(g));
                 }
             }

             return (out + g, Some(g));
-        }).0.to_string();
+        })
+        .0
+        .to_string();

     return processed;
 }
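The reformatted fold in `process_text` keeps its behavior: consecutive whitespace collapses to a single space, and a backslash escapes the next grapheme. A stripped-down sketch of that fold structure, operating on `char`s instead of graphemes and without the document/paragraph lookups, so it illustrates only the shape of the accumulator, not the full behavior:

```rust
// Simplified whitespace handling in the style of `process_text`:
// the fold carries (output so far, previous char); runs of whitespace
// become one space, a leading run is dropped, and '\' escapes the next char.
fn collapse_whitespace(content: &str) -> String {
    let mut escaped = false;
    content
        .chars()
        .fold((String::new(), None::<char>), |(mut out, prev), c| {
            if escaped {
                // Output the character literally when escaped
                escaped = false;
                out.push(c);
                return (out, Some(c));
            }
            if c == '\\' {
                escaped = true;
                return (out, Some(c));
            }
            if c.is_whitespace() {
                // Leading or consecutive whitespace is skipped;
                // otherwise a single space is emitted
                if prev.map_or(true, |p| p.is_whitespace()) {
                    return (out, Some(c));
                }
                out.push(' ');
                return (out, Some(c));
            }
            out.push(c);
            (out, Some(c))
        })
        .0
}

fn main() {
    assert_eq!(collapse_whitespace("a\nb"), "a b");
    assert_eq!(collapse_whitespace("a  \n b"), "a b");
    assert_eq!(collapse_whitespace("a\\b"), "ab");
    println!("{}", collapse_whitespace("a \n b"));
}
```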
@@ -94,39 +93,32 @@ pub fn process_text(document: &Document, content: &str) -> String
 /// # Example
 /// ```
 /// assert_eq!(process_escaped('\\', "%", "escaped: \\%, also escaped: \\\\\\%, untouched: \\a"),
-///     "escaped: %, also escaped: \\%, untouched \\a");
+///     "escaped: %, also escaped: \\%, untouched: \\a");
 /// ```
-pub fn process_escaped<S: AsRef<str>>(escape: char, token: &'static str, content: S) -> String
-{
+pub fn process_escaped<S: AsRef<str>>(escape: char, token: &'static str, content: S) -> String {
     let mut processed = String::new();
     let mut escaped = 0;
     let mut token_it = token.chars().peekable();
-    for c in content.as_ref().chars()
+    for c in content
+        .as_ref()
+        .chars()
+        .as_str()
+        .trim_start()
+        .trim_end()
+        .chars()
     {
-        if c == escape
-        {
+        if c == escape {
             escaped += 1;
-        }
-        else if escaped % 2 == 1 && token_it.peek().map_or(false, |p| *p == c)
-        {
+        } else if escaped % 2 == 1 && token_it.peek().map_or(false, |p| *p == c) {
             let _ = token_it.next();
-            if token_it.peek() == None
-            {
-                (0..((escaped-1)/2))
-                    .for_each(|_| processed.push(escape));
+            if token_it.peek() == None {
+                (0..(escaped / 2)).for_each(|_| processed.push(escape));
                 escaped = 0;
                 token_it = token.chars().peekable();
                 processed.push_str(token);
             }
-        }
-        else
-        {
-            if escaped != 0
-            {
+        } else {
+            if escaped != 0 {
                 // Add untouched escapes
                 (0..escaped).for_each(|_| processed.push('\\'));
                 token_it = token.chars().peekable();
@@ -141,9 +133,28 @@ pub fn process_escaped<S: AsRef<str>>(escape: char, token: &'static str, content
     processed
 }

+/// Parses source into a single paragraph
+/// If source contains anything but a single paragraph, an error is returned
+pub fn parse_paragraph<'a>(
+    parser: &dyn Parser,
+    source: Rc<dyn Source>,
+    document: &'a dyn Document<'a>,
+) -> Result<Box<Paragraph>, &'static str> {
+    let parsed = parser.parse(source.clone(), Some(document));
+    if parsed.content().borrow().len() > 1 {
+        return Err("Parsed document contains more than a single paragraph");
+    } else if parsed.content().borrow().len() == 0 {
+        return Err("Parser document is empty");
+    } else if parsed.last_element::<Paragraph>().is_none() {
+        return Err("Parsed element is not a paragraph");
+    }
+
+    let paragraph = parsed.content().borrow_mut().pop().unwrap();
+    Ok(paragraph.downcast::<Paragraph>().unwrap())
+}
+
 #[derive(Debug)]
-pub struct Property
-{
+pub struct Property {
     required: bool,
     description: String,
     default: Option<String>,
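The escape rules exercised by `process_escaped` can be sketched standalone. This is a simplified single-character-token variant (the real function above matches a multi-character token and trims its input), written from the behavior shown in the doc example and tests: an odd run of escape characters escapes the token and the run is halved, while a run not followed by the token is emitted untouched. The helper name is hypothetical.

```rust
// Single-char-token sketch of the escape handling: count a run of escape
// characters; if the token follows an odd run, emit run/2 escapes plus the
// bare token, otherwise emit the run literally.
fn process_escaped_char(escape: char, token: char, content: &str) -> String {
    let mut out = String::new();
    let mut run = 0usize;
    for c in content.chars() {
        if c == escape {
            run += 1;
        } else if c == token && run % 2 == 1 {
            // Odd run: the token is escaped; halve the escape run.
            (0..run / 2).for_each(|_| out.push(escape));
            out.push(token);
            run = 0;
        } else {
            // Untouched escapes are emitted literally.
            (0..run).for_each(|_| out.push(escape));
            run = 0;
            out.push(c);
        }
    }
    (0..run).for_each(|_| out.push(escape));
    out
}

fn main() {
    assert_eq!(
        process_escaped_char('\\', '%', "escaped: \\%, also escaped: \\\\\\%, untouched: \\a"),
        "escaped: %, also escaped: \\%, untouched: \\a"
    );
    assert_eq!(process_escaped_char('\\', ']', "Escaped \\]"), "Escaped ]");
    assert_eq!(process_escaped_char('\\', ']', "Unescaped \\\\]"), "Unescaped \\\\]");
    assert_eq!(process_escaped_char('\\', ']', "Escaped \\\\\\]"), "Escaped \\]");
}
```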
@@ -151,42 +162,70 @@ pub struct Property

 impl Property {
     pub fn new(required: bool, description: String, default: Option<String>) -> Self {
-        Self { required, description, default }
+        Self {
+            required,
+            description,
+            default,
+        }
     }
 }

-impl core::fmt::Display for Property
-{
+impl core::fmt::Display for Property {
     fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
-        match self.default.as_ref()
-        {
-            None => write!(f, "{} {}",
+        match self.default.as_ref() {
+            None => write!(
+                f,
+                "{} {}",
                 ["[Opt]", "[Req]"][self.required as usize],
-                self.description),
-            Some(default) => write!(f, "{} {} (Deafult: {})",
+                self.description
+            ),
+            Some(default) => write!(
+                f,
+                "{} {} (Deafult: {})",
                 ["[Opt]", "[Req]"][self.required as usize],
                 self.description,
-                default)
+                default
+            ),
         }
     }
 }

-#[derive(Debug)]
-pub struct PropertyMap<'a>
-{
-    pub(crate) properties: HashMap<String, (&'a Property, String)>
+pub enum PropertyMapError<E> {
+    ParseError(E),
+    NotFoundError(String),
+}
+
+#[derive(Debug)]
+pub struct PropertyMap<'a> {
+    pub(crate) properties: HashMap<String, (&'a Property, String)>,
 }

 impl<'a> PropertyMap<'a> {
     pub fn new() -> Self {
-        Self { properties: HashMap::new() }
+        Self {
+            properties: HashMap::new(),
+        }
     }

-    pub fn get<T, Error, F: FnOnce(&'a Property, &String) -> Result<T, Error>>(&self, name: &str, f: F)
-        -> Result<(&'a Property, T), Error> {
-        let (prop, value) = self.properties.get(name).unwrap();
+    pub fn get<T, Error, F: FnOnce(&'a Property, &String) -> Result<T, Error>>(
+        &self,
+        name: &str,
+        f: F,
+    ) -> Result<(&'a Property, T), PropertyMapError<Error>> {
+        let (prop, value) = match self.properties.get(name) {
+            Some(found) => found,
+            None => {
+                return Err(PropertyMapError::NotFoundError(format!(
+                    "Property `{name}` not found"
+                )))
+            }
+        };

-        f(prop, value).and_then(|value| Ok((*prop, value)))
+        match f(prop, value) {
+            Ok(parsed) => Ok((*prop, parsed)),
+            Err(err) => Err(PropertyMapError::ParseError(err)),
+        }
     }
 }
@@ -195,9 +234,7 @@ pub struct PropertyParser {
 }

 impl PropertyParser {
-    pub fn new(properties: HashMap<String, Property>) -> Self {
-        Self { properties }
-    }
+    pub fn new(properties: HashMap<String, Property>) -> Self { Self { properties } }

     /// Attempts to build a default propertymap
     ///
@@ -205,17 +242,14 @@ impl PropertyParser {
     pub fn default(&self) -> Result<PropertyMap<'_>, String> {
         let mut properties = PropertyMap::new();

-        for (name, prop) in &self.properties
-        {
-            match (prop.required, prop.default.as_ref())
-            {
+        for (name, prop) in &self.properties {
+            match (prop.required, prop.default.as_ref()) {
                 (true, None) => return Err(format!("Missing property `{name}` {prop}")),
-                (false, None) => {},
+                (false, None) => {}
                 (_, Some(default)) => {
-                    properties.properties.insert(
-                        name.clone(),
-                        (prop, default.clone())
-                    );
+                    properties
+                        .properties
+                        .insert(name.clone(), (prop, default.clone()));
                 }
             }
         }
@@ -233,13 +267,14 @@ impl PropertyParser {
     /// # Example
     ///
     /// ```
-    /// let properties = HashMap::new();
-    /// properties.insert("width", Property::new(true, "Width of the element in em", None));
+    /// let mut properties = HashMap::new();
+    /// properties.insert("width".to_string(),
+    ///     Property::new(true, "Width of the element in em".to_string(), None));
     ///
     /// let parser = PropertyParser::new(properties);
     /// let pm = parser.parse("width=15").unwrap();
     ///
-    /// assert!(pm.get("width", |_, val| val.parse::<i32>()) == Ok(15));
+    /// assert_eq!(pm.get("width", |_, s| s.parse::<i32>()).unwrap().1, 15);
    /// ```
    /// # Return value
    ///
@@ -251,8 +286,7 @@ impl PropertyParser
     /// Note: Only ',' inside values can be escaped, other '\' are treated literally
     pub fn parse(&self, content: &str) -> Result<PropertyMap<'_>, String> {
         let mut properties = PropertyMap::new();
-        let mut try_insert = |name: &String, value: &String|
-            -> Result<(), String> {
+        let mut try_insert = |name: &String, value: &String| -> Result<(), String> {
             let trimmed_name = name.trim_end().trim_start();
             let trimmed_value = value.trim_end().trim_start();
             let prop = match self.properties.get(trimmed_name)
@@ -263,11 +297,11 @@ impl PropertyParser
                 Some(prop) => prop
             };

-            if let Some((_, previous)) = properties.properties.insert(
-                trimmed_name.to_string(),
-                (prop, trimmed_value.to_string()))
+            if let Some((_, previous)) = properties
+                .properties
+                .insert(trimmed_name.to_string(), (prop, trimmed_value.to_string()))
             {
-                return Err(format!("Duplicate property `{trimmed_name}`, previous value: `{previous}` current value: `{trimmed_value}`"))
+                return Err(format!("Duplicate property `{trimmed_name}`, previous value: `{previous}` current value: `{trimmed_value}`"));
             }

             Ok(())
@@ -277,67 +311,213 @@ impl PropertyParser
         let mut name = String::new();
         let mut value = String::new();
         let mut escaped = 0usize;
-        for c in content.chars()
-        {
-            if c == '\\'
-            {
+        for c in content.chars() {
+            if c == '\\' {
                 escaped += 1;
-            }
-            else if c == '=' && in_name
-            {
+            } else if c == '=' && in_name {
                 in_name = false;
                 (0..escaped).for_each(|_| name.push('\\'));
                 escaped = 0;
-            }
-            else if c == ',' && !in_name
-            {
+            } else if c == ',' && !in_name {
-                if escaped % 2 == 0
-                // Not escaped
-                {
+                if escaped % 2 == 0 // Not escaped
+                {
-                    (0..escaped/2).for_each(|_| value.push('\\'));
+                    (0..escaped / 2).for_each(|_| value.push('\\'));
                     escaped = 0;
                     in_name = true;

                     if let Err(e) = try_insert(&name, &value) {
-                        return Err(e)
+                        return Err(e);
                     }
                     name.clear();
                     value.clear();
-                }
-                else
-                {
-                    (0..(escaped-1)/2).for_each(|_| value.push('\\'));
+                } else {
+                    (0..(escaped - 1) / 2).for_each(|_| value.push('\\'));
                     value.push(',');
                     escaped = 0;
                 }
-            }
-            else
-            {
+            } else {
                 if in_name {
                     (0..escaped).for_each(|_| name.push('\\'));
                     name.push(c)
-                }
-                else {
+                } else {
                     (0..escaped).for_each(|_| value.push('\\'));
                     value.push(c)
                 }
                 escaped = 0;
             }
         }
-        if !in_name && value.trim_end().trim_start().is_empty()
-        {
-            return Err("Expected a value after last `=`".to_string())
-        }
-        else if name.is_empty() || value.is_empty()
-        {
+        if !in_name && value.trim_end().trim_start().is_empty() {
+            return Err("Expected a value after last `=`".to_string());
+        } else if name.is_empty() || value.is_empty() {
             return Err("Expected non empty property list.".to_string());
         }

         if let Err(e) = try_insert(&name, &value) {
-            return Err(e)
+            return Err(e);
         }

-        // TODO: Missing properties
+        if let Err(e) = self.properties.iter().try_for_each(|(key, prop)| {
+            if !properties.properties.contains_key(key) {
+                if let Some(default) = &prop.default {
+                    properties
+                        .properties
+                        .insert(key.clone(), (prop, default.clone()));
+                } else if prop.required {
+                    return Err(format!("Missing required property: {prop}"));
+                }
+            }
+            Ok(())
+        }) {
+            Err(e)
+        } else {
+            Ok(properties)
+        }
     }
 }
+#[cfg(test)]
+mod tests {
+    use super::*;
+    use crate::document::langdocument::LangDocument;
+    use crate::elements::comment::Comment;
+    use crate::elements::style::Style;
+    use crate::elements::text::Text;
+    use crate::parser::source::SourceFile;
+    use crate::parser::source::Token;
+    use std::rc::Rc;
+
+    #[test]
+    fn process_text_tests() {
+        let source = Rc::new(SourceFile::with_content(
+            "".to_string(),
+            "".to_string(),
+            None,
+        ));
+        let doc = LangDocument::new(source.clone(), None);
+
+        assert_eq!(process_text(&doc, "a\nb"), "a b");
+        assert_eq!(process_text(&doc, "a\n\nb"), "a b"); // Should never happen but why not
+        assert_eq!(process_text(&doc, "a\\b"), "ab");
+        assert_eq!(process_text(&doc, "a\\\nb"), "a\nb");
+        assert_eq!(process_text(&doc, "a\\\\b"), "a\\b");
+        assert_eq!(process_text(&doc, "a\\\\\nb"), "a\\ b");
+        assert_eq!(process_text(&doc, "\na"), "a");
+
+        let tok = Token::new(0..0, source);
+        doc.push(Box::new(Paragraph::new(tok.clone())));
+
+        // Comments are ignored (kind => Invisible)
+        (&doc as &dyn Document)
+            .last_element_mut::<Paragraph>()
+            .unwrap()
+            .push(Box::new(Comment::new(tok.clone(), "COMMENT".to_string())));
+        assert_eq!(process_text(&doc, "\na"), "a");
+
+        // A space is appended as previous element is inline
+        (&doc as &dyn Document)
+            .last_element_mut::<Paragraph>()
+            .unwrap()
+            .push(Box::new(Text::new(tok.clone(), "TEXT".to_string())));
+        assert_eq!(process_text(&doc, "\na"), " a");
+
+        (&doc as &dyn Document)
+            .last_element_mut::<Paragraph>()
+            .unwrap()
+            .push(Box::new(Style::new(tok.clone(), 0, false)));
+        assert_eq!(process_text(&doc, "\na"), " a");
+    }
+
+    #[test]
+    fn process_escaped_tests() {
+        assert_eq!(
+            process_escaped(
+                '\\',
+                "%",
+                "escaped: \\%, also escaped: \\\\\\%, untouched: \\a"
+            ),
+            "escaped: %, also escaped: \\%, untouched: \\a"
+        );
+        assert_eq!(
+            process_escaped('"', "><)))°>", "Escaped fish: \"><)))°>"),
+            "Escaped fish: ><)))°>".to_string()
+        );
+        assert_eq!(
+            process_escaped('\\', "]", "Escaped \\]"),
+            "Escaped ]".to_string()
+        );
+        assert_eq!(
+            process_escaped('\\', "]", "Unescaped \\\\]"),
+            "Unescaped \\\\]".to_string()
+        );
+        assert_eq!(
+            process_escaped('\\', "]", "Escaped \\\\\\]"),
+            "Escaped \\]".to_string()
+        );
+        assert_eq!(
+            process_escaped('\\', "]", "Unescaped \\\\\\\\]"),
+            "Unescaped \\\\\\\\]".to_string()
+        );
+    }
+
+    #[test]
+    fn property_parser_tests() {
+        let mut properties = HashMap::new();
+        properties.insert(
+            "width".to_string(),
+            Property::new(true, "Width of the element in em".to_string(), None),
+        );
+        properties.insert(
+            "length".to_string(),
+            Property::new(false, "Length in cm".to_string(), None),
+        );
+        properties.insert(
+            "angle".to_string(),
+            Property::new(
+                true,
+                "Angle in degrees".to_string(),
+                Some("180".to_string()),
+            ),
+        );
+        properties.insert(
+            "weight".to_string(),
+            Property::new(false, "Weight in %".to_string(), Some("0.42".to_string())),
+        );
+
+        let parser = PropertyParser::new(properties);
+        let pm = parser.parse("width=15,length=-10").unwrap();
+
+        // Ok
+        assert_eq!(pm.get("width", |_, s| s.parse::<i32>()).unwrap().1, 15);
+        assert_eq!(pm.get("length", |_, s| s.parse::<i32>()).unwrap().1, -10);
+        assert_eq!(pm.get("angle", |_, s| s.parse::<f64>()).unwrap().1, 180f64);
+        assert_eq!(pm.get("angle", |_, s| s.parse::<i32>()).unwrap().1, 180);
+        assert_eq!(
+            pm.get("weight", |_, s| s.parse::<f32>()).unwrap().1,
+            0.42f32
+        );
+        assert_eq!(
+            pm.get("weight", |_, s| s.parse::<f64>()).unwrap().1,
+            0.42f64
+        );
+
+        // Error
+        assert!(pm.get("length", |_, s| s.parse::<u32>()).is_err());
+        assert!(pm.get("height", |_, s| s.parse::<f64>()).is_err());
+
+        // Missing property
+        assert!(parser.parse("length=15").is_err());
+
+        // Defaults
+        assert!(parser.parse("width=15").is_ok());
+        assert_eq!(
+            parser
+                .parse("width=0,weight=0.15")
+                .unwrap()
+                .get("weight", |_, s| s.parse::<f32>())
+                .unwrap()
+                .1,
+            0.15f32
+        );
+    }
+}
@@ -51,9 +51,9 @@ impl Backend {
         let parser = LangParser::default();
         let doc = parser.parse(Rc::new(source), None);

-        let semantic_tokens = semantic_token_from_document(&doc);
-        self.semantic_token_map
-            .insert(params.uri.to_string(), semantic_tokens);
+        //let semantic_tokens = semantic_token_from_document(&doc);
+        //self.semantic_token_map
+        //    .insert(params.uri.to_string(), semantic_tokens);
     }
 }
110 style.css Normal file
@@ -0,0 +1,110 @@
+body {
+    background-color: #1b1b1d;
+    color: #c5c5c5;
+    font-family: sans-serif;
+
+    max-width: 90ch;
+    margin: 0 auto;
+}
+
+em {
+    padding-left: .1em;
+    padding-right: .1em;
+
+    border-radius: 3px;
+    border: solid 1px #100c1e;
+
+    color: #ffb454;
+    background-color: #191f26;
+}
+
+/* Styles */
+a.inline-code
+{
+    padding-left: .1em;
+    padding-right: .1em;
+
+    border-radius: 1px;
+    background-color: #191f26;
+}
+
+/* Code blocks */
+div.code-block-title {
+    background-color: #20202a;
+    padding-left: .3em;
+}
+
+div.code-block-content {
+    max-height: 20em;
+    margin-bottom: 0.2em;
+    width: auto;
+
+    overflow: auto;
+
+    background-color: #0f141a;
+}
+
+div.code-block-content td {
+    border: 0;
+    padding: 0;
+}
+
+div.code-block-content pre {
+    border: 0;
+    margin: 0;
+}
+
+div.code-block-content .code-block-gutter {
+    -moz-user-select: none;
+    -ms-user-select: none;
+    -webkit-user-select: none;
+    user-select: none;
+
+    padding-left: .1em;
+    padding-right: .2em;
+    text-align: right;
+
+    border-right: solid #2a2e3e 1px;
+    background: #222d3a;
+}
+
+div.code-block-content .code-block-line {
+    padding-left: .1em;
+}
+
+/* Media */
+.media {
+    max-width: 85ch;
+    margin: auto;
+    text-align: center;
+}
+
+.medium {
+    padding-top: 1em;
+    display: inline-block;
+    vertical-align: middle;
+    margin-left: .5em;
+    margin-right: .5em;
+}
+
+.medium img {
+    max-width: 100%;
+}
+
+div.medium p.medium-refname {
+    margin: 0;
+    text-align: center;
+
+    font-weight: bold;
+    color: #9424af;
+}
+
+div.medium p {
+    padding: 0;
+    margin-top: 0;
+    margin-left: 1em;
+    margin-right: 1em;
+
+    text-align: justify;
+}
268 third/latex2svg Executable file
@@ -0,0 +1,268 @@
+#!/usr/bin/env python3
+"""latex2svg
+
+Read LaTeX code from stdin and render a SVG using LaTeX, dvisvgm and svgo.
+
+Returns a minified SVG with `width`, `height` and `style="vertical-align:"`
+attribues whose values are in `em` units. The SVG will have (pseudo-)unique
+IDs in case more than one is used on the same HTML page.
+
+Based on [original work](https://github.com/tuxu/latex2svg) by Tino Wagner.
+"""
+__version__ = '0.2.1'
+__author__ = 'Matthias C. Hormann'
+__email__ = 'mhormann@gmx.de'
+__license__ = 'MIT'
+__copyright__ = 'Copyright (c) 2022, Matthias C. Hormann'
+
+import os
+import sys
+import subprocess
+import shlex
+import re
+from tempfile import TemporaryDirectory
+from ctypes.util import find_library
+
+default_template = r"""
+\documentclass[{{ fontsize }}pt,preview]{standalone}
+{{ preamble }}
+\begin{document}
+\begin{preview}
+{{ code }}
+\end{preview}
+\end{document}
+"""
+
+default_preamble = r"""
+\usepackage[utf8x]{inputenc}
+\usepackage{amsmath}
+\usepackage{amsfonts}
+\usepackage{amssymb}
+\usepackage{amstext}
+\usepackage{newtxtext}
+\usepackage[libertine]{newtxmath}
+% prevent errors from old font commands
+\DeclareOldFontCommand{\rm}{\normalfont\rmfamily}{\mathrm}
+\DeclareOldFontCommand{\sf}{\normalfont\sffamily}{\mathsf}
+\DeclareOldFontCommand{\tt}{\normalfont\ttfamily}{\mathtt}
+\DeclareOldFontCommand{\bf}{\normalfont\bfseries}{\mathbf}
+\DeclareOldFontCommand{\it}{\normalfont\itshape}{\mathit}
+\DeclareOldFontCommand{\sl}{\normalfont\slshape}{\@nomath\sl}
+\DeclareOldFontCommand{\sc}{\normalfont\scshape}{\@nomath\sc}
+% prevent errors from undefined shortcuts
+\newcommand{\N}{\mathbb{N}}
+\newcommand{\R}{\mathbb{R}}
+\newcommand{\Z}{\mathbb{Z}}
+"""
+
+default_svgo_config = r"""
+module.exports = {
+  plugins: [
+    {
+      // use default preset (almost)
+      name: 'preset-default',
+      params: {
+        overrides: {
+          // viewbox required to resize SVGs with CSS, disable removal
+          removeViewBox: false,
+        },
+      },
+    },
+    {
+      // enable prefixIds
+      name: 'prefixIds',
+      params: {
+        prefix: '{{ prefix }}',
+        delim: '_',
+      },
+    },
+  ],
+};
+"""
+
+latex_cmd = 'latex -interaction nonstopmode -halt-on-error'
+dvisvgm_cmd = 'dvisvgm --no-fonts'
+svgo_cmd = 'svgo'
+
+default_params = {
+    'fontsize': 12,  # TeX pt
+    'template': default_template,
+    'preamble': default_preamble,
+    'latex_cmd': latex_cmd,
+    'dvisvgm_cmd': dvisvgm_cmd,
+    'svgo_cmd': svgo_cmd,
+    'svgo_config': default_svgo_config,
+    'libgs': None,
+}
+
+
+if not hasattr(os.environ, 'LIBGS') and not find_library('gs'):
+    if sys.platform == 'darwin':
+        # Fallback to homebrew Ghostscript on macOS
+        homebrew_libgs = '/usr/local/opt/ghostscript/lib/libgs.dylib'
+        if os.path.exists(homebrew_libgs):
+            default_params['libgs'] = homebrew_libgs
+    if not default_params['libgs']:
+        print('Warning: libgs not found', file=sys.stderr)
+
+
+def latex2svg(code, params=default_params, working_directory=None):
+    """Convert LaTeX to SVG using dvisvgm and svgo.
+
+    Parameters
+    ----------
+    code : str
+        LaTeX code to render.
+    params : dict
+        Conversion parameters.
+    working_directory : str or None
+        Working directory for external commands and place for temporary files.
+
+    Returns
+    -------
+    dict
+        Dictionary of SVG output and output information:
+
+        * `svg`: SVG data
+        * `width`: image width in *em*
+        * `height`: image height in *em*
+        * `valign`: baseline offset in *em*
+    """
+    if working_directory is None:
+        with TemporaryDirectory() as tmpdir:
+            return latex2svg(code, params, working_directory=tmpdir)
+
+    # Caution: TeX & dvisvgm work with TeX pt (1/72.27"), but we need DTP pt (1/72")
+    # so we need a scaling factor for correct output sizes
+    scaling = 1.00375  # (1/72)/(1/72.27)
+
+    fontsize = params['fontsize']
+    document = code
+
+    with open(os.path.join(working_directory, 'code.tex'), 'w') as f:
+        f.write(document)
+
+    # Run LaTeX and create DVI file
+    try:
+        ret = subprocess.run(shlex.split(params['latex_cmd']+' code.tex'),
+                             stdout=subprocess.PIPE, stderr=subprocess.PIPE,
+                             cwd=working_directory)
+        ret.check_returncode()
+    except FileNotFoundError:
+        raise RuntimeError('latex not found')
+
+    # Add LIBGS to environment if supplied
+    env = os.environ.copy()
+    if params['libgs']:
+        env['LIBGS'] = params['libgs']
+
+    # Convert DVI to SVG
+    try:
+        ret = subprocess.run(shlex.split(params['dvisvgm_cmd']+' code.dvi'),
+                             stdout=subprocess.PIPE, stderr=subprocess.PIPE,
+                             cwd=working_directory, env=env)
+        ret.check_returncode()
+    except FileNotFoundError:
+        raise RuntimeError('dvisvgm not found')
+
+    # Parse dvisvgm output for size and alignment
+    def get_size(output):
+        regex = r'\b([0-9.]+)pt x ([0-9.]+)pt'
+        match = re.search(regex, output)
+        if match:
+            return (float(match.group(1)) / fontsize * scaling,
+                    float(match.group(2)) / fontsize * scaling)
+        else:
+            return None, None
+
+    def get_measure(output, name):
+        regex = r'\b%s=([0-9.e-]+)pt' % name
+        match = re.search(regex, output)
+        if match:
+            return float(match.group(1)) / fontsize * scaling
+        else:
+            return None
+
+    output = ret.stderr.decode('utf-8')
+    width, height = get_size(output)
+    depth = get_measure(output, 'depth')
+
+    # Modify SVG attributes, to a get a self-contained, scaling SVG
+    from lxml import etree
+    # read SVG, discarding all comments ("<-- Generated by… -->")
+    parser = etree.XMLParser(remove_comments=True)
+    xml = etree.parse(os.path.join(working_directory, 'code.svg'), parser)
+    svg = xml.getroot()
+    svg.set('width', f'{width:.6f}em')
+    svg.set('height', f'{height:.6f}em')
+    svg.set('style', f'vertical-align:{-depth:.6f}em')
+    xml.write(os.path.join(working_directory, 'code.svg'))
+
+    # Run svgo to get a minified oneliner with (pseudo-)unique Ids
+    # generate random prefix using ASCII letters (ID may not start with a digit)
+    import random, string
+    prefix = ''.join(random.choice(string.ascii_letters) for n in range(4))
+    svgo_config = (params['svgo_config']
+                   .replace('{{ prefix }}', prefix))
+
+    # write svgo params file
+    with open(os.path.join(working_directory, 'svgo.config.js'), 'w') as f:
+        f.write(svgo_config)
+
+    try:
+        ret = subprocess.run(shlex.split(params['svgo_cmd']+' code.svg'),
+                             stdout=subprocess.PIPE, stderr=subprocess.PIPE,
+                             cwd=working_directory, env=env)
+        ret.check_returncode()
+    except FileNotFoundError:
+        raise RuntimeError('svgo not found')
+
+    with open(os.path.join(working_directory, 'code.svg'), 'r') as f:
+        svg = f.read()
+
+    return {'svg': svg, 'valign': round(-depth,6),
+            'width': round(width,6), 'height': round(height,6)}
+
+
+def main():
+    """Simple command line interface to latex2svg.
+
+    - Read from `stdin`.
+    - Write SVG to `stdout`.
+    - On error: write error messages to `stderr` and return with error code.
+    """
+    import json
+    import argparse
+    parser = argparse.ArgumentParser(description="""
+    Render LaTeX code from stdin as SVG to stdout. Writes metadata (baseline
+    offset, width, height in em units) into the SVG attributes.
+    """)
+    parser.add_argument('--version', action='version',
|
||||
version='%(prog)s {version}'.format(version=__version__))
|
||||
parser.add_argument('--preamble',
|
||||
help="LaTeX preamble code to read from file")
|
||||
parser.add_argument('--fontsize',
|
||||
help="LaTeX fontsize in pt")
|
||||
args = parser.parse_args()
|
||||
preamble = default_preamble
|
||||
if args.preamble is not None:
|
||||
with open(args.preamble) as f:
|
||||
preamble = f.read()
|
||||
fontsize = 12
|
||||
if args.fontsize is not None:
|
||||
fontsize = int(args.fontsize)
|
||||
latex = sys.stdin.read()
|
||||
try:
|
||||
params = default_params.copy()
|
||||
params['preamble'] = preamble
|
||||
params['fontsize'] = fontsize
|
||||
out = latex2svg(latex, params)
|
||||
sys.stdout.write(out['svg'])
|
||||
except subprocess.CalledProcessError as exc:
|
||||
# LaTeX prints errors on stdout instead of stderr (stderr is empty),
|
||||
# dvisvgm to stderr, so print both (to stderr)
|
||||
print(exc.output.decode('utf-8'), file=sys.stderr)
|
||||
print(exc.stderr.decode('utf-8'), file=sys.stderr)
|
||||
sys.exit(exc.returncode)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
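The unit conversion in the code above can be exercised on its own: dvisvgm reports lengths in TeX points (1/72.27 in), which are rescaled to DTP points (1/72 in) and divided by the font size to obtain em units. A minimal sketch of that arithmetic (the helper name `tex_pt_to_em` is illustrative, not part of the module):

```python
# TeX pt (1/72.27 in) -> DTP pt (1/72 in) conversion factor,
# matching the module's `scaling` constant: (1/72)/(1/72.27) = 72.27/72
SCALING = 72.27 / 72  # 1.00375

def tex_pt_to_em(length_pt: float, fontsize_pt: float) -> float:
    """Convert a dvisvgm-reported length in TeX pt to em at a given font size."""
    return length_pt / fontsize_pt * SCALING

# A 12 pt length at a 12 pt base font is slightly more than 1 em
print(round(tex_pt_to_em(12.0, 12), 6))  # 1.00375
```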
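The attribute-rewriting step (em-based `width`/`height` plus a `vertical-align` style, so the SVG scales with the surrounding font and sits on the text baseline) can be sketched against a minimal in-memory SVG. This uses the stdlib `xml.etree.ElementTree` as a stand-in for the module's lxml call, with made-up metric values:

```python
import xml.etree.ElementTree as ET

def set_svg_metrics(svg_text: str, width: float, height: float, depth: float) -> str:
    """Set em-based size attributes and baseline alignment on a root <svg>."""
    root = ET.fromstring(svg_text)
    root.set('width', f'{width:.6f}em')
    root.set('height', f'{height:.6f}em')
    # a positive depth means the box extends below the text baseline
    root.set('style', f'vertical-align:{-depth:.6f}em')
    return ET.tostring(root, encoding='unicode')

out = set_svg_metrics('<svg/>', 1.2, 0.9, 0.3)
print(out)
```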