[ art / civ / cult / cyb / diy / drg / feels / layer / lit / λ / q / r / sci / sec / tech / w / zzz ] archive provided by lainchan.jp

lainchan archive - /tech/ - 35433



File: 1489229415252.png (94 KB, 220x300, lo0t.exploit-v2_buffer_overflow.png)

No.35433

The free and open source community urgently needs to explore developing crypto libraries and applications in other programming languages, and implementing protocols as strictly as possible.

I'm not an opponent of C/C++, and I'm not interested in those opponents who crusade to eliminate Unix and C for a better computing world. I don't have many complaints about all those memory errors created by C/C++.

But in some cases, the cost of a memory error is just too high: it enables mass bypass and decryption of encrypted data on the Internet. Meanwhile, the cost of finding such an error is very low, and adversaries possess them en masse.

Another source of vulnerability is the loose implementation of crypto protocols such as TLS, done in order to support legacy stuff (SSLv2 handshakes and EXPORT ciphers, anyone?) that almost nobody uses. Combined with memory errors, it's a complete disaster.

The actual algorithms, Twofish or AES, matter less. What we really need are strict implementations of common crypto protocols, written in safer (in terms of memory) programming languages.

There's a fascinating research project called "Not Quite So Broken" (a quite accurate summary of our crypto nowadays). They have written an experimental implementation of the X.509 and TLS stack in OCaml, as well as an OTR library. They dropped legacy support and interpreted the protocols strictly, and found the stack was still able to talk to 70% of HTTPS servers. The downside is that the project seems to have gone inactive after their paper was published and reported on. The community really needs someone to pick it up and maintain it as a long-term effort, seriously, have the code audited, etc. And we need more projects like this.
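To make the "interpret protocols strictly" idea concrete, here is a toy sketch in Python (not code from nqsb or any real TLS stack) of parsing a TLS record header against a strict whitelist, raising an error instead of guessing what a malformed or legacy peer "probably meant":

```python
import struct

# Toy sketch: strictly parse a 5-byte TLS record header.
# Anything outside the whitelist is rejected outright -- no SSLv2
# fallback, no "be liberal in what you accept".

CONTENT_TYPES = {20: "change_cipher_spec", 21: "alert",
                 22: "handshake", 23: "application_data"}
ALLOWED_VERSIONS = {(3, 3)}      # TLS 1.2 only, for this sketch
MAX_RECORD_LEN = 2 ** 14         # plaintext length limit from RFC 5246

def parse_record_header(header: bytes):
    """Return (content_type, version, length) or raise ValueError."""
    if len(header) != 5:
        raise ValueError("header must be exactly 5 bytes")
    ctype, major, minor, length = struct.unpack("!BBBH", header)
    if ctype not in CONTENT_TYPES:
        raise ValueError(f"unknown content type {ctype}")
    if (major, minor) not in ALLOWED_VERSIONS:
        raise ValueError(f"unsupported version {major}.{minor}")
    if length > MAX_RECORD_LEN:
        raise ValueError("record too long")
    return CONTENT_TYPES[ctype], (major, minor), length
```

An SSLv2-style handshake (first byte 0x80) fails immediately at the content-type check, which is exactly the kind of legacy path a lenient C stack would instead try to accommodate.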

  No.35434

>>35433

>I'm not an opponent of the C/C++, and I'm not interested to those opponents who crusades to eliminate Unix and C for a better computing world. I don't have much complaints about all those memory errors created by C/C++.


You don't even understand the argument you're pretending to retreat from. You're just making people less likely to agree with you on principle.

>The community really needs someone to pick it up and maintain it as a long time effort, seriously, have the code audited, etc, and we need more projects like this.


Go ahead.

  No.35570

No
Also look at codecrypt, a post-quantum crypto application.

  No.36101

So, is your thread about other programming languages applied to the cryptography field, or about the general modeling of cryptographic protocols?

If the first is true, then I would say: formalization is not profitable. Applying a formal proof to a piece of code requires much effort, and not many people are willing to spend so much work on this task. We have had some great advancements in this area, especially concerning topics like dependent types and automated theorem provers (such as Isabelle/HOL), but it is still too complex for normal people.
Also, about languages: most of the code used in cryptography, at least for the primitives, is checked by hand. If you go to a mailing list like "modern crypto"[1] you'll find threads about this. People there usually write the code in assembly.
I do agree with you, though, that some abstraction layer could help. For example, there are two main areas that could use more focus from the community: language checking and compilers.
Language checking can be done inside dependent-type languages (as in Idris and ATS), or by external checkers, like the simplified C subsets that can be more easily parsed for checking and optimization (see OpenPAT).
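As a very rough flavor of what such checking buys you, here is a toy Python sketch (all names made up for illustration, and the "cipher" is a XOR placeholder, not real crypto): distinct wrapper types make it impossible to pass a ciphertext where a key is expected. This is only a weak runtime echo of what Idris or ATS can enforce statically at compile time.

```python
from dataclasses import dataclass

# Distinct wrapper types for values that are all "just bytes" underneath.
@dataclass(frozen=True)
class Key:
    raw: bytes

@dataclass(frozen=True)
class Ciphertext:
    raw: bytes

def xor_encrypt(key: Key, plaintext: bytes) -> Ciphertext:
    """Toy XOR 'cipher' (NOT real crypto); rejects wrongly typed arguments."""
    if not isinstance(key, Key):
        raise TypeError("first argument must be a Key")
    stream = (key.raw * (len(plaintext) // len(key.raw) + 1))[:len(plaintext)]
    return Ciphertext(bytes(p ^ s for p, s in zip(plaintext, stream)))

def xor_decrypt(key: Key, ct: Ciphertext) -> bytes:
    """Swapping key and ciphertext fails loudly instead of silently."""
    if not isinstance(key, Key) or not isinstance(ct, Ciphertext):
        raise TypeError("arguments swapped or of the wrong type")
    return xor_encrypt(key, ct.raw).raw
```

In a dependent-type language the same discipline costs nothing at runtime, and can go further (e.g. lengths and nonce-freshness tracked in the types).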

If the second is true, then I would say that some efforts have already been made, like the SeLF project:
http://www.cs.cmu.edu/~self/

But, in my (not-so-trustworthy) opinion, there would be a need to go meta on this: not just analyze the interactions between the parts, but also watch it "from above" and see what the overall model is actually doing, and whether it's really optimized or a mess. That's the point of projects involving modeling languages, such as UML (Unified Modeling Language).

These fields are not well explored for two main reasons: they are foundational/complicated, and they're not profitable. The first refers to the point that the study of this subject is actually the basis of all other knowledge. It comes from questioning the basics of logic itself, going into the philosophy of logic and metalogic, and analysing not just its usage in practical mathematics, but also questioning: what is mathematics? Isn't it just an abstraction? Kurt Gödel is a very interesting read if you want to think about this, and also Agrippa's trilemma and constructivism (realizability).

ps. too much text. people will not even read all this soykaf... but look, you're still here!




[1] https://moderncrypto.org/