tech-kern archive


Re: [ANN] Lunatik -- NetBSD kernel scripting with Lua (GSoC project results)



On Mon, Oct 11, 2010 at 11:50 PM, Matthew Mondor
<mm_lists%pulsar-zone.net@localhost> wrote:
> On Sun, 10 Oct 2010 19:45:41 -0600
> Samuel Greear <lua%evilcode.net@localhost> wrote:
>
>> I didn't like the fact that the only option for loading a script into
>> the kernel was to load the script source.  I would make loading
>> pre-compiled scripts the preferred method.  In fact, I would probably
>> tear eval out of the kernel Lua implementation and only support
>> loading of precompiled bytecode into the kernel.
>
> If the tokenizer is considered heavy, or a potential source of exploits,
> or if scripts are expected to be loaded frequently and a performance
> bottleneck exists, I also think that loading pre-tokenized bytecode
> would be a good idea.

No, the Lua lexer/parser is not heavy (see [1]).
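
To illustrate, here is a rough userland sketch (plain Lua 5.1, nothing
Lunatik-specific): source and precompiled chunks go through the same
loading entry point, the only difference being whether the parser runs.

  -- compile from source: this is where the lexer/parser runs
  local src = "return 2 + 2"
  local chunk = assert(loadstring(src))

  -- dump to the precompiled form (what luac emits) and load it back;
  -- this path skips the parser entirely
  local bytecode = string.dump(chunk)
  local precompiled = assert(loadstring(bytecode))

  print(precompiled())   -- 4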

> However, there are several things to consider: some systems (e.g., Java)
> do important sanity checks at tokenization time.  Is this important for
> Lua?

Yes, Lua performs important sanity checks in its lexer/parser.
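
As a small example (again plain userland Lua 5.1), malformed source is
rejected by the parser at load time, before anything is executed:

  -- loadstring() returns nil plus a message from the parser, e.g.
  -- something like: [string "if x then"]:1: 'end' expected near '<eof>'
  local f, err = loadstring("if x then")
  print(f, err)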

> Secondly, is the Lua bytecode using a stable, well-defined instruction
> set which is unlikely to change?  Otherwise, as it improves and gets
> updated, any pre-tokenized scripts might need to be regenerated.  Of
> course, that's probably not an issue if everything is part of the base
> system and always gets rebuilt together.

Yes, Lua bytecode is stable and uses a well-defined instruction set,
but Lua doesn't perform bytecode verification (see [2]).
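
By contrast, a precompiled chunk is accepted as-is.  If a loader wants to
restrict what it takes (source only, or bytecode only), it has to inspect
the chunk itself.  A minimal sketch (plain Lua 5.1, just checking the
LUA_SIGNATURE prefix; not how the kernel binding currently does it):

  -- precompiled chunks start with "\27Lua" (ESC, 'L', 'u', 'a')
  local function is_bytecode(s)
    return s:byte(1) == 27
  end

  local chunk = string.dump(assert(loadstring("return 1")))
  print(is_bytecode(chunk))        -- true
  print(is_bytecode("return 1"))   -- false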

> Thanks,
> --
> Matt
>

[1] http://marc.info/?l=lua-l&m=128676702329567&w=2
[2] http://marc.info/?l=lua-l&m=128676669829325&w=2

Cheers,
--
Lourival Vieira Neto

