Hacker News
Quadratic Micropass Type Inference
ux266478
> If we could instead unify types in the order the end-user deems most important
I think the problem with this is that the desirable priority is context- and programmer-dependent, and no real reason is given for why ordering expressions is unacceptable. This approach also apparently still isn't independent of the order of expressions; it just imposes additional structure on top of it with multiple inference passes, making the whole affair more rigid and more complicated. I can only guess this makes reasoning about higher-order polymorphism much harder.
Ranking on order alone is simple, easy to internalize, easy to reason about, and gives the utmost control to the programmer. For instance, with the example given, this is the intuitive way I'd have constructed that function without even really thinking about it:
fn example(x) -> Point[int] {
let p = Point(x, x);
log(["origin: ", x]);
return p;
}
netting the "ideal" error.
I think the problem here stems from an expectation that type inference is a kind of magic wand for getting something like dynamic typing. So when no thought is put into the actual structure of the program with regard to type inference, what you get instead is an icky system that isn't Just Doing The Thing. Vibes-based typing? If that's the goal, I wonder if it might be better served by fuzzy inference based on multiple passes generating potential conclusions with various confidence scores.
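The order-ranked behaviour argued for above can be illustrated with a toy unifier (a minimal sketch of my own, not the article's algorithm; `TypeVar`, `infer`, and the constraint tuples are all invented for illustration): constraints are unified strictly in source order, the first conflict gets the blame, so reordering expressions moves the reported error site.

```python
class TypeVar:
    def __init__(self, name):
        self.name = name
        self.bound = None  # resolved type, once unified

def resolve(t):
    # Follow bindings until we reach a concrete type or an unbound var.
    while isinstance(t, TypeVar) and t.bound is not None:
        t = t.bound
    return t

def unify(a, b, site):
    a, b = resolve(a), resolve(b)
    if isinstance(a, TypeVar):
        a.bound = b
        return None
    if isinstance(b, TypeVar):
        b.bound = a
        return None
    if a != b:
        return f"type mismatch at {site}: expected {a}, got {b}"
    return None

def infer(constraints):
    # Unify in source order; blame the first constraint that fails.
    for a, b, site in constraints:
        err = unify(a, b, site)
        if err:
            return err
    return "ok"

# Reordering the same two constraints changes which site is blamed:
x1 = TypeVar("x")
print(infer([(x1, "int", "Point(x, x)"), (x1, "string", "log call")]))
# type mismatch at log call: expected int, got string
x2 = TypeVar("x")
print(infer([(x2, "string", "log call"), (x2, "int", "Point(x, x)")]))
# type mismatch at Point(x, x): expected string, got int
```

Because blame falls on whichever conflicting constraint comes second, the programmer controls the error site purely by ordering the expressions, as in the rearranged `example` above.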
edmundgoodman
If this system only provides benefits on the type-error path of the compiler, I wonder whether a traditional single-pass unification could be used for speed on the common path, where code compiles without type errors; then, when unification fails, this slower multi-pass approach could be run on demand to give better error reporting. This would lazily avoid the cost of the approach in most cases, and the cases in which it would be used are less latency-critical anyway.
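That fast-path/slow-path split might look roughly like this (all names hypothetical; `slow_diagnose` merely stands in for the article's multi-pass inference, which is far more involved):

```python
def fast_unify(constraints):
    # Cheap single pass over (var, type) pairs: succeeds or fails,
    # with no effort spent on producing good diagnostics.
    env = {}
    for var, ty in constraints:
        if env.setdefault(var, ty) != ty:
            return False
    return True

def slow_diagnose(constraints):
    # Stand-in for a slower diagnostic pass: collect every
    # conflicting binding so the error can be reported well.
    seen = {}
    conflicts = []
    for var, ty in constraints:
        if var in seen and seen[var] != ty:
            conflicts.append((var, seen[var], ty))
        else:
            seen[var] = ty
    return conflicts

def check(constraints):
    if fast_unify(constraints):
        return "ok"
    # Only pay the diagnostic cost once compilation has already failed.
    return slow_diagnose(constraints)

print(check([("x", "int")]))                   # ok
print(check([("x", "int"), ("x", "string")]))  # [('x', 'int', 'string')]
```

The key property is that the slow pass runs only after the fast pass has already rejected the program, so well-typed code never pays for it.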
Also, I think there is a typo in one of the code blocks: ‘2 should be unified into (string, string) not just string afaict
simvux
Comparative benchmarks are tricky. I considered making a simpler single-pass inference branch based on the same data structures to create a more one-to-one comparison, but this algorithm is rather different, so porting it wasn't very straightforward. I'm currently integrating this into my real compiler, so from there it'll be easier to estimate the real-world performance impact of this system.
Typo has been fixed, thanks!