Update 2021-07-30: this concept is the de Bruijn factor.
- a spectrum of formalization, from informal representations (e.g., LaTeX) through increasing formality, such as computer algebra systems (CAS) and then proof assistants
- "levels of zoom," as in visualization: like Google Maps, but for math
- compiling "high-level concepts" (e.g., LaTeX) down to controlled natural language and then to proofs
The idea of "compiling" is about making implicit concepts explicit. Compiling to different architectures is also analogous to formalizing in different logics and axiom systems. Unlike the visualization analogy, though, there is no single underlying basis in which to specify formalization.
What are the degrees of formalization in math?
- verbal (hallway chats, post-conference dinners, coffee breaks)
- white board or chalk board
- physical paper + pen or pencil
- typesetting: LaTeX or equivalents such as Markdown, pretextbook.org (XML-based), or pandoc; PowerPoint; Rmd, reStructuredText, MyST, Jupyter Book
- semantic enrichment/decoration: sTeX, OMDoc; PDG inference rules; pf2 package (sty)
- controlled natural languages (e.g., Naproche)
- CAS verification of inference rules (e.g., Sage, SymPy)
- formal verification in a selected target logic and axiom system (e.g., Lean, Isabelle, Coq)
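At the fully formal end, a statement must be proved against a fixed logic and axiom system and checked by a kernel. A toy sketch in Lean 4 syntax (the theorem name is mine; core Lean already provides this lemma as `Nat.zero_add`):

```lean
-- Fully formal: every natural number n satisfies 0 + n = n,
-- proved by induction and checked by Lean's kernel.
theorem zero_add_eq (n : Nat) : 0 + n = n := by
  induction n with
  | zero => rfl
  | succ k ih => rw [Nat.add_succ, ih]
```

Here nothing is implicit: the induction principle, the rewrite steps, and the definition of `+` are all spelled out or resolved by the elaborator, which is exactly the "making implicit concepts explicit" that the compiling analogy describes.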
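To make the CAS level concrete, here is a minimal sketch using SymPy (the identities chosen are illustrative, not from the original notes). A CAS does not produce a proof object; it mechanically checks that a symbolic manipulation holds, which sits between typeset math and formal verification on the spectrum above.

```python
# CAS-level checking: confirm symbolic identities with SymPy.
from sympy import symbols, expand, simplify, sin, cos

a, b, x = symbols("a b x")

# Check the binomial identity (a + b)^2 = a^2 + 2ab + b^2
# by expanding the left-hand side into canonical form.
lhs = expand((a + b) ** 2)
rhs = a**2 + 2 * a * b + b**2
print(lhs == rhs)  # True

# Check the Pythagorean identity sin^2(x) + cos^2(x) = 1.
print(simplify(sin(x) ** 2 + cos(x) ** 2))  # 1
```

Note the trust model: we are trusting SymPy's rewriting engine, not a small proof kernel, so this is weaker than the proof-assistant level below it on the list.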