Refocusing (semantics)

In computer science, refocusing is a program transformation used to implement a reduction semantics—i.e., a small-step operational semantics with an explicit representation of the reduction context—more efficiently. It is a step towards implementing a deterministic semantics as a deterministic abstract machine.

A small-step operational semantics defines the meaning of a given program as a sequence of one-step reductions that starts with and continues with a sequence of reducts , where :

A one-step reduction from to is achieved by

  1. locating the smallest potentially reducible term (potential redex) using a given reduction strategy, if this potential redex exists in (otherwise is irreducible); and
  2. contracting this potential redex using given contraction rules if it is an actual one (otherwise is stuck).

A reduction semantics is a small-step operational semantics with an explicit representation of the context of each potential redex.

Writing C for such a context, the sequence of one-step reductions above reads:

  t_0 = C_0[r_0] -> C_0[c_0] = t_1 = C_1[r_1] -> C_1[c_1] = t_2 -> ...

where

  1. t_0 is first decomposed into the context C_0 and a potential redex r_0,
  2. r_0 is contracted into the contractum c_0,
  3. C_0 is then recomposed around c_0 to yield t_1, which is then decomposed into the context C_1 and a potential redex r_1,
  4. etc.

This succession of decompositions, contractions, and recompositions is depicted as follows:

         contract               contract               contract        
       o--------->o           o--------->o           o--------->o      
      /            \         /            \         /            \     
     /    recompose \       /    recompose \       /    recompose \    
    /                \     /                \     /                \   
   / decompose        \   / decompose        \   / decompose        \  
  /                    \ /                    \ /                    \ 
 o--------------------->o--------------------->o--------------------->o
          reduce                 reduce                 reduce        

Refocusing is a deforestation of the successive reducts:[1][2]

         contract    refocus    contract    refocus    contract        
       o--------->o---------->o--------->o---------->o--------->o------
      /            \         /            \         /            \     
     /    recompose \       /    recompose \       /    recompose \    
    /                \     /                \     /                \   
   / decompose        \   / decompose        \   / decompose        \  
  /                    \ /                    \ /                    \ 
 o                      o                      o                      o

After the initial decomposition, the succession of contractions and refocusings has the structure of a deterministic abstract machine.

Background

The semantics of a programming language defines the meaning of the programs written in this programming language. Plotkin's Structural Operational Semantics is a small-step semantics where the meaning of a program is defined step by step and where each step is an elementary operation that is carried out with contraction rules.

Example

Consider the following deterministic language of arithmetic expressions over integers with additions and quotients, in the manner of Hutton's razor.[3][4]

In OCaml:

type operator = Add | Quo;;

type expression = Lit of int | Opr of expression * operator * expression;;

So 1 + 10 is parsed as Opr (Lit 1, Add, Lit 10) and 11 / 2 is parsed as Opr (Lit 11, Quo, Lit 2).

type value = Int of int;;

let expression_of_value (v : value) : expression = match v with Int n -> Lit n;;

The smallest potentially reducible expressions (potential redexes) are operations over values, and they are carried out with a contraction function that maps an actual redex to an expression and otherwise yields an error message:

type potential_redex = PR of value * operator * value;;

type contractum_or_error = Contractum of expression | Error of string;;

let contract (pr : potential_redex) : contractum_or_error =
  match pr with
    PR (Int n1, Add, Int n2) ->
     Contractum (Lit (n1 + n2))
  | PR (Int n1, Quo, Int n2) ->
     if n2 = 0
     then Error (string_of_int n1 ^ " / 0")
     else Contractum (Lit (n1 / n2));;

The addition of two integers is an actual redex, and so is the quotient of an integer and a nonzero integer. So, for example, the expression Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100), i.e., (1 + 10) + 100, reduces to Opr (Lit 11, Add, Lit 100), i.e., 11 + 100; the expression Opr (Lit 11, Quo, Lit 2), i.e., 11 / 2, reduces to Lit 5, i.e., 5, since 11 = 5 × 2 + 1; and the expression Opr (Opr (Lit 1, Quo, Lit 0), Add, Lit 100), i.e., (1 / 0) + 100, is stuck, with the error message "1 / 0".
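For instance, applying contract at an OCaml toplevel gives (a small usage sketch; the expected responses are shown in comments):

contract (PR (Int 1, Add, Int 10));;   (* Contractum (Lit 11) *)
contract (PR (Int 11, Quo, Int 2));;   (* Contractum (Lit 5) *)
contract (PR (Int 1, Quo, Int 0));;    (* Error "1 / 0" *)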

Say that the reduction strategy is leftmost-innermost (i.e., depth first and left to right), as captured by the following congruence rules:

                    e1 -> e1'
     ---------------------------------------
     Opr (e1, opr, e2) -> Opr (e1', opr, e2)
 
                    e2 -> e2'
 -----------------------------------------------
 Opr (Lit n1, opr, e2) -> Opr (Lit n1, opr, e2')

The following one-step reduction function, reduce_d, implements this strategy.
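Its codomain distinguishes values, reducts, and stuck expressions; the declaration of this data type is omitted in the text above and is reconstructed here from its use below:

type value_or_expression_or_stuck = Value of value | Expression of expression | Stuck of string;;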

let rec reduce_d (e : expression) : value_or_expression_or_stuck =
  match e with
    Lit n ->
     Value (Int n)
  | Opr (e1, opr, e2) ->
     match reduce_d e1 with
       Value v1 ->
        (match reduce_d e2 with
           Value v2 ->
            (match contract (PR (v1, opr, v2)) with
               Contractum e ->
                Expression e
             | Error s ->
                Stuck s)
         | Expression e2' ->
            Expression (Opr (expression_of_value v1, opr, e2'))
         | Stuck s ->
            Stuck s)
     | Expression e1' ->
        Expression (Opr (e1', opr, e2))
     | Stuck s ->
        Stuck s;;

In words:

  • a literal reduces to a value;
  • if the expression e1 is stuck, then so is the expression Opr (e1, opr, e2), for any expression e2;
  • if the expression e1 reduces to an expression e1', then for any expression e2, Opr (e1, opr, e2) reduces to Opr (e1', opr, e2);
  • if the expression e1 reduces to a value v1, then
    • if the expression e2 is stuck, then so is the expression Opr (e1, opr, e2);
    • if the expression e2 reduces to an expression e2', then Opr (e1, opr, e2) reduces to Opr (e1, opr, e2');
    • if the expression e2 reduces to a value v2, then Opr (e1, opr, e2) is a potential redex:
      • if this potential redex is an actual one, then contracting it yields an expression; Opr (e1, opr, e2) reduces to this expression;
      • if this potential redex is not an actual one, then Opr (e1, opr, e2) is stuck.

Evaluation is achieved by iterated reduction. It yields either a value or an error message:

type result = Normal_form of value | Wrong of string;;

let rec normalize_d (e : expression) : result =
  match reduce_d e with
    Value v ->
     Normal_form v
  | Expression e' ->
     normalize_d e'
  | Stuck s ->
     Wrong s;;

This one-step reduction function is structurally recursive. It implements a Structural Operational Semantics for this minimalistic language of arithmetic expressions with errors.
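At an OCaml toplevel, for instance (a small usage sketch; the expected responses are shown in comments):

normalize_d (Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100));;   (* Normal_form (Int 111) *)
normalize_d (Opr (Lit 11, Quo, Lit 2));;                       (* Normal_form (Int 5) *)
normalize_d (Opr (Opr (Lit 1, Quo, Lit 0), Add, Lit 100));;    (* Wrong "1 / 0" *)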

For example, the reduction function implicitly constructs the following proof tree to carry out the reduction step (1 + (5 + 5)) + (2 + 20) -> (1 + 10) + (2 + 20):

                   ------------
                   5 + 5  -> 10
              ---------------------
               1 + (5 + 5) -> 1 + 10
 -----------------------------------------------
 (1 + (5 + 5)) + (2 + 20) -> (1 + 10) + (2 + 20)

Reformatting this proof tree to emphasize the implicit decomposition yields:

                  -------------------------------------------------->
                ^                 contraction                         |
                |         -----------------------------               |
                |         5 + 5              ->      10               |
    implicit    |    ----------------------------------               | implicit
  decomposition |    1 + (5 + 5)             ->  1 + 10               | recomposition
                |   -----------------------------------------------   |
                |   (1 + (5 + 5)) + (2 + 20) -> (1 + 10) + (2 + 20)   v

A reduction semantics is a small-step operational semantics where the implicit context of a potential redex is made explicit. So one reduction step gives rise to

  1. constructing the context of the redex,
  2. contracting this redex, and
  3. recomposing the context around the contractum to yield the reduct:
  (1 + (5 + 5)) + (2 + 20)  \
 [(1 + (5 + 5)) + (2 + 20)] | explicit
 [[1 + (5 + 5)] + (2 + 20)] | decomposition
 [[1 + [5 + 5]] + (2 + 20)] /
                               contraction
 [[1 + [ 10  ]] + (2 + 20)] \
 [[1 +   10   ] + (2 + 20)] | explicit
 [(1 +   10   ) + (2 + 20)] | recomposition
  (1 +   10   ) + (2 + 20)  /

And pictorially, an arithmetic expression is evaluated in successive steps:

         contract               contract               contract        
       o--------->o           o--------->o           o--------->o      
      /            \         /            \         /            \     
     /    recompose \       /    recompose \       /    recompose \    
    /                \     /                \     /                \   
   / decompose        \   / decompose        \   / decompose        \  
  /                    \ /                    \ /                    \ 
 o--------------------->o--------------------->o--------------------->o
          reduce                 reduce                 reduce        

Transforming the one-step reduction function into continuation-passing style, delimiting the continuation from type value_or_expression_or_stuck -> 'a to type value_or_expression_or_stuck -> value_or_expression_or_stuck, and splitting this delimited continuation into two continuations (one to continue the decomposition and one to recompose, using the type isomorphism between functions out of a sum type and pairs of functions, i.e., between (a + b) -> c and (a -> c) * (b -> c)) makes it simple to implement the corresponding normalization function:

type value_or_decomposition_cc =
  Val_cc of value
| Dec_cc of potential_redex * (value -> value_or_decomposition_cc) * (expression -> expression);;

let rec decompose_expression_cc (e : expression) (kd : value -> value_or_decomposition_cc) (kr : expression -> expression) : value_or_decomposition_cc =
  match e with
    Lit n ->
     kd (Int n)
  | Opr (e1, opr, e2) ->
     decompose_expression_cc
       e1
       (fun v1 ->
         decompose_expression_cc
           e2
           (fun v2 ->
             Dec_cc (PR (v1, opr, v2), kd, kr))
           (fun e2' ->
             kr (Opr (expression_of_value v1, opr, e2'))))
       (fun e1' ->
         kr (Opr (e1', opr, e2)));;

let decompose_cc (e : expression) : value_or_decomposition_cc =
  decompose_expression_cc e (fun v -> Val_cc v) (fun e' -> e');;

let rec iterate_cc_rb (vod : value_or_decomposition_cc) : result =
  match vod with
    Val_cc v ->
     Normal_form v
  | Dec_cc (pr, kd, kr) ->
     (match contract pr with
        Contractum e ->
         iterate_cc_rb (decompose_cc (kr e))
      | Error s ->     (*^^^^^^^^^^^^^^^^^*)
         Wrong s);;

let normalize_cc_rb (e : expression) : result =
  iterate_cc_rb (decompose_cc e);;

In the underlined code, the contractum is recomposed and the result is decomposed. This normalization function is said to be reduction-based because it enumerates all the reducts in the reduction sequence.
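As a sanity check, this reduction-based normalization function computes the same results as normalize_d (a small usage sketch; the expected responses are shown in comments):

normalize_cc_rb (Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100));;   (* Normal_form (Int 111) *)
normalize_cc_rb (Opr (Opr (Lit 1, Quo, Lit 0), Add, Lit 100));;    (* Wrong "1 / 0" *)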

Refocusing

Extensionally, the refocusing thesis is that there is no need to reconstruct the next reduct in order to decompose it in the next reduction step. In other words, these intermediate reducts can be deforested.

Pictorially:

         contract    refocus    contract    refocus    contract        
       o--------->o---------->o--------->o---------->o--------->o------
      /            \         /            \         /            \     
     /    recompose \       /    recompose \       /    recompose \    
    /                \     /                \     /                \   
   / decompose        \   / decompose        \   / decompose        \  
  /                    \ /                    \ /                    \ 
 o                      o                      o                      o

Intensionally, the refocusing thesis is that this deforestation is achieved by continuing the decomposition of the contractum in the current context.[1][2]

let rec iterate_cc_rf (vod : value_or_decomposition_cc) : result =
  match vod with
    Val_cc v ->
     Normal_form v
  | Dec_cc (pr, kd, kr) ->
     (match contract pr with
        Contractum e ->
         iterate_cc_rf (decompose_expression_cc e kd kr)
      | Error s ->     (*^^^^^^^^^^^^^^^^^^^^^^^^^^^^^*)
         Wrong s);;

let normalize_cc_rf (e : expression) : result =
  iterate_cc_rf (decompose_cc e);;

In the underlined code, the decomposition is continued. This normalization function is said to be reduction-free because it enumerates none of the reducts in the reduction sequence.
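The reduction-free normalization function computes the same results as the reduction-based one (a small usage sketch; the expected responses are shown in comments):

normalize_cc_rf (Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100));;   (* Normal_form (Int 111) *)
normalize_cc_rf (Opr (Opr (Lit 1, Quo, Lit 0), Add, Lit 100));;    (* Wrong "1 / 0" *)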

In practice, the two continuations are defunctionalized into traditional first-order, inside-out contexts, which yields an implementation of Felleisen and Hieb's reduction semantics,[5] a small-step semantics that was designed independently of continuations and defunctionalization, but whose representation—as illustrated here—can be obtained by CPS-transforming and defunctionalizing the representation of a Structural Operational Semantics.

The construction sketched above is completely formalized using the Coq Proof Assistant.[4]
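For illustration, here is a minimal sketch of such a defunctionalized, refocused normalizer for the arithmetic language above. The constructor and function names (Empty, Left, Right, decompose_expression, decompose_context, refocus, iterate, normalize) are illustrative choices and are not taken from the cited sources:

(* First-order, inside-out reduction contexts, corresponding to
   defunctionalizing the two continuations kd and kr: *)
type context =
    Empty                                      (* [.]           *)
  | Left of operator * expression * context    (* C[[.] opr e2] *)
  | Right of value * operator * context        (* C[v1 opr [.]] *);;

(* Recomposing an inside-out context around an expression: *)
let rec recompose (c : context) (e : expression) : expression =
  match c with
    Empty -> e
  | Left (opr, e2, c') -> recompose c' (Opr (e, opr, e2))
  | Right (v1, opr, c') -> recompose c' (Opr (expression_of_value v1, opr, e));;

type value_or_decomposition =
    Val of value
  | Dec of potential_redex * context;;

(* Decomposing an expression in a given context, leftmost-innermost: *)
let rec decompose_expression (e : expression) (c : context) : value_or_decomposition =
  match e with
    Lit n -> decompose_context c (Int n)
  | Opr (e1, opr, e2) -> decompose_expression e1 (Left (opr, e2, c))
and decompose_context (c : context) (v : value) : value_or_decomposition =
  match c with
    Empty -> Val v
  | Left (opr, e2, c') -> decompose_expression e2 (Right (v, opr, c'))
  | Right (v1, opr, c') -> Dec (PR (v1, opr, v), c');;

(* Refocusing: decompose the contractum in the current context,
   without recomposing and re-decomposing the whole reduct: *)
let refocus (e : expression) (c : context) : value_or_decomposition =
  decompose_expression e c;;

let rec iterate (vod : value_or_decomposition) : result =
  match vod with
    Val v -> Normal_form v
  | Dec (pr, c) ->
     (match contract pr with
        Contractum e -> iterate (refocus e c)
      | Error s -> Wrong s);;

let normalize (e : expression) : result =
  iterate (decompose_expression e Empty);;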

Applications

Over the years, refocusing has been used for inter-deriving calculi and abstract machines.[6][7] Besides the CEK machine, the Krivine machine, and the SECD machine, examples also include the chemical abstract machine and abstract machines for JavaScript.[8][9][10] Bach Poulsen and Mosses have also used refocusing for implementing Structural Operational Semantics and Modular Structural Operational Semantics.[11]

More broadly, refocusing has been used for deriving type systems and implementations for coroutines,[12] for going from type checking via reduction to type checking via evaluation,[13] for deriving a classical call-by-need sequent calculus,[14] for deriving interpretations of the gradually-typed lambda calculus,[15] and for full reduction.[16][17]

Correctness

Danvy and Nielsen stated conditions for refocusing and proved them informally.[2] Sieczkowski, Biernacka, and Biernacki formalized refocusing using the Coq Proof Assistant.[18] Bach Poulsen proved the correctness of refocusing for XSOS using rule induction.[19] Biernacka, Charatonik, and Zielińska generalized refocusing using the Coq Proof Assistant.[20] Using Agda, Swierstra proved the refocusing step as part of his formalization of the syntactic correspondence between the λ-calculus with a normal-order reduction strategy and the Krivine machine.[21] Also using Agda, Rozowski proved the refocusing step as part of his formalization of the syntactic correspondence between the λ-calculus with an applicative-order reduction strategy and the CEK machine.[22]

References

  1. ^ a b Danvy, Olivier; Nielsen, Lasse R. (2001). Syntactic theories in practice. Second International Workshop on Rule-Based Programming (RULE 2001). Vol. 59. Electronic Notes in Theoretical Computer Science. doi:10.7146/brics.v9i4.21721.
  2. ^ a b c Danvy, Olivier; Nielsen, Lasse R. (2004). Refocusing in reduction semantics (Technical report). BRICS. doi:10.7146/brics.v11i26.21851. RS-04-26.
  3. ^ https://delimited-continuation.github.io/refocusing-arithmetic-expressions.ml
  4. ^ a b Danvy, Olivier (2023). A Deforestation of Reducts: Refocusing (Technical report).
  5. ^ Felleisen, Matthias; Hieb, Robert (1992). "The Revised Report on the Syntactic Theories of Sequential Control and State". Theoretical Computer Science. 103 (2): 235–271. doi:10.1016/0304-3975(92)90014-7.
  6. ^ Biernacka, Małgorzata; Danvy, Olivier (2007). "A Concrete Framework for Environment Machines". ACM Transactions on Computational Logic. 9 (1). Article #6: 1–30. doi:10.7146/brics.v13i3.21909.
  7. ^ Biernacka, Małgorzata; Danvy, Olivier (2007). "A Syntactic Correspondence between Context-Sensitive Calculi and Abstract Machines". Theoretical Computer Science. 375 (1–3): 76–108. doi:10.7146/brics.v12i22.21888.
  8. ^ Danvy, Olivier; Millikin, Kevin (2008). "A rational deconstruction of Landin's SECD machine with the J operator". Logical Methods in Computer Science. 4 (4): 1–67. arXiv:0811.3231. doi:10.2168/LMCS-4(4:12)2008. S2CID 7926360.
  9. ^ Şerbǎnuţǎ, Traian Florin; Roşu, Grigore; Meseguer, José (2009). "A rewriting logic approach to operational semantics". Information and Computation. 207 (2): 305–340. doi:10.1016/j.ic.2008.03.026. hdl:2142/11265.
  10. ^ Van Horn, David; Might, Matthew (2018). An Analytic Framework for JavaScript (Technical report).
  11. ^ Bach Poulsen, Casper; Mosses, Peter D. (2013). "Generating specialized interpreters for modular structural operational semantics". Logic-Based Program Synthesis and Transformation. Lecture Notes in Computer Science. Vol. 8901. Logic-Based Program Synthesis and Transformation, 23rd International Symposium, LOPSTR 2013. pp. 220–236. doi:10.1007/978-3-319-14125-1_13. ISBN 978-3-319-14124-4.
  12. ^ Anton, Konrad; Thiemann, Peter (2010). Typing Coroutines. Vol. 6546. Trends in Functional Programming - 11th International Symposium (TFP). pp. 16–30.
  13. ^ Sergey, Ilya (2012). Operational Aspects of Type Systems: Inter-Derivable Semantics of Type Checking and Gradual Types for Object Ownership (Thesis). KU Leuven.
  14. ^ Ariola, Zena M.; Downen, Paul; Nakata, Keiko; Saurin, Alexis; Herbelin, Hugo (2012). Classical call-by-need sequent calculi: The unity of semantic artifacts. Functional and Logic Programming, 11th International Symposium, FLOPS 2012. Springer. pp. 32–46. doi:10.1007/978-3-642-29822-6_6.
  15. ^ García-Pérez, Álvaro; Nogueira, Pablo; Sergey, Ilya (2014). "Deriving interpretations of the gradually-typed lambda calculus". Proceedings of the ACM SIGPLAN 2014 Workshop on Partial Evaluation and Program Manipulation. Partial Evaluation and Semantics-Based Program Manipulation (PEPM 2014). pp. 157–168. doi:10.1145/2543728.2543742. ISBN 9781450326193.
  16. ^ Munk, Johan (2007). A study of syntactic and semantic artifacts and its application to lambda definability, strong normalization, and weak normalization in the presence of state (Thesis). Aarhus University. doi:10.7146/brics.v15i3.21938.
  17. ^ García-Pérez, Álvaro; Nogueira, Pablo (2014). "On the syntactic and functional correspondence between hybrid (or layered) normalisers and abstract machines". Science of Computer Programming. 95 (2): 176–199. doi:10.1016/j.scico.2014.05.011.
  18. ^ Sieczkowski, Filip; Biernacka, Małgorzata; Biernacki, Dariusz (2011). "Automating Derivations of Abstract Machines from Reduction Semantics: A Generic Formalization of Refocusing in Coq". Lecture Notes in Computer Science. Vol. 6647. Implementation and Application of Functional Languages. pp. 72–88. doi:10.1007/978-3-642-24276-2_5. ISBN 978-3-642-24275-5.
  19. ^ Bach Poulsen, Casper (2015). Extensible Transition System Semantics (Thesis). Swansea University.
  20. ^ Biernacka, Małgorzata; Charatonik, Witold; Zielińska, Klara (2017). Generalized Refocusing: From Hybrid Strategies to Abstract Machines. Vol. 84. 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). pp. 1–17.
  21. ^ Swierstra, Wouter (2012). "From mathematics to abstract machine: A formal derivation of an executable Krivine machine". Electronic Proceedings in Theoretical Computer Science. 76. Proceedings of the Fourth Workshop on Mathematically Structured Functional Programming (MSFP 2012): 163–177. arXiv:1202.2924. doi:10.4204/EPTCS.76.10. S2CID 14668530.
  22. ^ Rozowski, Wojciech (2021). Formally verified derivation of an executable and terminating CEK machine from call-by-value lambda-p-hat-calculus (PDF) (Thesis). University of Southampton.