SPLASH 2020
Sun 15 - Sat 21 November 2020 Online Conference
Fri 20 Nov 2020 11:20 - 11:40 at SPLASH-III - F-3B Chair(s): Yaniv David, Francisco Ferreira
Fri 20 Nov 2020 23:20 - 23:40 at SPLASH-III - F-3B Chair(s): Dimitri Racordon, Yulei Sui

Gradual typing provides a methodology for integrating static and dynamic typing, harmonizing their often conflicting advantages in a single language. To enjoy the advantages of static typing, users of most gradual languages must add type annotations, which requires nontrivial work such as understanding program behaviors and invariants. Done incorrectly, the added type annotations can be wrong, leading to inconsistencies between the program and its annotations. Gradual typing implementations detect such inconsistencies at runtime, raise cast errors, and generate error messages. However, relying solely on these messages to understand and fix inconsistencies and the resulting cast errors is often insufficient, for two reasons. First, while such a message covers an inconsistency in one execution path, fixing it often requires reconciling information from multiple paths. Second, users may add many wrong type annotations and later find it difficult to identify and fix them among all the annotations they added.
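To make the failure mode concrete, here is a minimal sketch (our own illustration, not Reticulated Python's actual implementation) of how a gradual language turns a wrong annotation into a runtime cast error: the compiler inserts a cast at the boundary of annotated code, so the mistake only surfaces on an execution path that exercises it. The `cast` helper and the `checked_average` wrapper are hypothetical stand-ins for compiler-inserted checks.

```python
def cast(value, expected_type, blame):
    """Runtime check a gradual compiler might insert at a typed/untyped boundary."""
    if not isinstance(value, expected_type):
        raise TypeError(
            f"cast error at {blame}: expected {expected_type.__name__}, "
            f"got {type(value).__name__}"
        )
    return value


def average(xs: list) -> int:  # wrong annotation: the true return type is float
    return sum(xs) / len(xs)


def checked_average(xs):
    # Stands in for the cast the compiler would insert because of `-> int`.
    return cast(average(xs), int, "return of average")


try:
    checked_average([1, 2, 4])  # division yields a float, so the cast fails
except TypeError as e:
    print(e)
```

The error message blames the return cast of `average`, but deciding whether to fix the annotation (to `float`) or the body (to return an `int`) requires looking beyond the single failing path, which is precisely the difficulty the abstract describes.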

Recent studies provide evidence that type annotations added during program migration are often wrong and that many programmers prefer compile-time warnings about wrong annotations. Motivated by these results, we develop exploratory typing to help with the static detection, understanding, and fixing of inconsistencies. The key idea of exploratory typing is to systematically remove dynamic types and explore alternative static type annotations that can remedy inconsistencies. To demonstrate the feasibility of exploratory typing, we have implemented it in PyHound, which targets programs written in Reticulated Python, a gradual variant of Python. We have evaluated PyHound on a set of Python programs, and the results show that exploratory typing detects inconsistencies in 98% of the tested programs and fixes 93% of inconsistencies, significantly outperforming pytype, a widely used Python tool for enforcing type annotations.
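The core move of exploratory typing, replacing a dynamic type with candidate static types and keeping those consistent with the program, can be sketched in a few lines. This is our reconstruction of the idea for a single annotation slot, not PyHound's actual algorithm; the candidate set and sample-based consistency check are simplifying assumptions (PyHound works statically, whereas this sketch tests candidates against sample runs).

```python
def average(xs):
    return sum(xs) / len(xs)


# Hypothetical candidate pool of static types to try in place of Dyn.
CANDIDATES = [int, float, str, list]


def explore_return_type(fn, sample_inputs):
    """Return the candidate static return types consistent with fn on all samples."""
    consistent = []
    for t in CANDIDATES:
        if all(isinstance(fn(xs), t) for xs in sample_inputs):
            consistent.append(t)
    return consistent


# Division always produces a float, so only `float` survives exploration,
# suggesting `-> float` as the fix for a wrong `-> int` annotation.
print(explore_return_type(average, [[1, 2, 4], [2.5, 3.5]]))
```

Because exploration considers every candidate against all available evidence at once, it can propose a single repaired annotation instead of reporting one cast error per failing execution path.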

Fri 20 Nov

Displayed time zone: Central Time (US & Canada)

11:00 - 12:20
F-3B OOPSLA at SPLASH-III +12h
Chair(s): Yaniv David Technion, Francisco Ferreira Imperial College London
11:00
20m
Talk
Just-in-Time Learning for Bottom-Up Enumerative Synthesis
OOPSLA
Shraddha Barke University of California at San Diego, Hila Peleg University of California at San Diego, Nadia Polikarpova University of California at San Diego
11:20
20m
Talk
Taming Type Annotations in Gradual Typing
OOPSLA
John Peter Campora University of Louisiana at Lafayette, Sheng Chen University of Louisiana at Lafayette
11:40
20m
Talk
Learning Semantic Program Embeddings with Graph Interval Neural Network (Distinguished Paper)
OOPSLA
Yu Wang Nanjing University, Ke Wang Visa Research, Fengjuan Gao Nanjing University, Linzhang Wang Nanjing University
12:00
20m
Talk
ιDOT: A DOT Calculus with Object Initialization
OOPSLA
Ifaz Kabir University of Alberta, Yufeng Li University of Waterloo, Ondřej Lhoták University of Waterloo
23:00 - 00:20
F-3B OOPSLA at SPLASH-III
Chair(s): Dimitri Racordon University of Geneva, Switzerland, Yulei Sui University of Technology Sydney
23:00
20m
Talk
Just-in-Time Learning for Bottom-Up Enumerative Synthesis
OOPSLA
Shraddha Barke University of California at San Diego, Hila Peleg University of California at San Diego, Nadia Polikarpova University of California at San Diego
23:20
20m
Talk
Taming Type Annotations in Gradual Typing
OOPSLA
John Peter Campora University of Louisiana at Lafayette, Sheng Chen University of Louisiana at Lafayette
23:40
20m
Talk
Learning Semantic Program Embeddings with Graph Interval Neural Network (Distinguished Paper)
OOPSLA
Yu Wang Nanjing University, Ke Wang Visa Research, Fengjuan Gao Nanjing University, Linzhang Wang Nanjing University
00:00
20m
Talk
ιDOT: A DOT Calculus with Object Initialization
OOPSLA
Ifaz Kabir University of Alberta, Yufeng Li University of Waterloo, Ondřej Lhoták University of Waterloo