The baby and the bathwater


mark.reinhold
Stephen Colebourne's recent blog entry [1] contains many true statements,
along with some reasonable advice for library maintainers.  To summarize:

  - As of Java 9, with Jigsaw, there are two ways in which a library can
    be used: Either on the traditional class path, or on the newfangled
    module path.  If you maintain a library but don't modularize it then
    it can still -- unbeknownst to you -- be used on the module path as
    an automatic module.

  - When code runs on the module path there are some differences in the
    behavior of some pre-9 APIs, in particular those related to resource
    lookup.

  - As a consequence, if you maintain a library that's intended to work
    on Java 9 or later then you should test it on both the class path
    and the module path, even if you do nothing to convert your library
    to a module.  If your library doesn't work on the module path then
    you should either fix it or document that limitation.

  - If you don't modularize your library, or at least claim an automatic
    module name for it via the `Automatic-Module-Name` manifest entry
    (see the manifest sketch after this list), then you potentially
    block the maintainers of libraries that depend upon yours from
    modularizing their own libraries.

  - The tools that we use, and in particular Maven, could be improved.
    It's difficult to compile the classes for a modular JAR file that's
    intended to work on the class path of pre-9 releases, it's difficult
    to test a library on both the class path and the module path, and
    various Maven plugins still need to be upgraded to handle (or else
    ignore) `module-info.java` files.  (Stephen has helpfully filed
    issues in the appropriate bug trackers for some of these problems.)

  - Some old tools, bytecode libraries, and other systems fail when they
    encounter `module-info.class` files or multi-release JAR files.
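
As an aside on the `Automatic-Module-Name` point above: claiming a name
is a one-line manifest change.  A minimal sketch, with a hypothetical
module name:

    Automatic-Module-Name: org.example.mylib

With that entry in `META-INF/MANIFEST.MF`, libraries that depend on
yours can say `requires org.example.mylib;` without waiting for you to
ship a full module descriptor.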

From these points Stephen concludes that the module system, "as currently
designed, has 'negative benefits' for open source libraries," saying that
this is primarily because "the split (bifurcation) of the module-path
from the class-path is an absolute nightmare."

Hyperbole aside, Stephen's main complaint here is only about the need
to test a library on both the class path and the module path if it's
intended to work on Java 9 or later.  With automated testing this
shouldn't, in principle, be a huge burden, but still it's worth asking
the question: Could we have avoided the need for such dual testing if
we hadn't introduced the module path as separate from the class path?

Consider, as a thought experiment, an alternative Jigsaw design that
didn't have a separate module path, and instead treated modular JARs
on the class path as modules rather than traditional JAR files.  You
wouldn't have to dual-test if your baseline is Java 9 or later, but
if you want to support earlier releases with the same artifact then
you'd still have to test on the class path.

With the actual Jigsaw design you do need to dual-test your library
when your baseline is Java 9 or later.  There is, however, a benefit
to this: If someone uses your library in an application that works on
the Java 8 class path today then they can migrate it to the Java 9 (or
later) class path and then, when they're ready, move your library (and
perhaps some others) over to the module path.  (There were many other
reasons to define the module path as separate from the class path, but
those aren't directly relevant here.)

The tradeoff, then, is a little bit more dual testing on the part of
library maintainers in exchange for greater flexibility for those who
will migrate existing applications to Java 9 or later releases.  Many
library maintainers will be reluctant to baseline to Java 9 (or later)
for a while yet, so they'll be dual-testing anyway; I think this was
the right tradeoff.

Stephen closes with a specific suggestion:

  "There needs to be a way for a library author to insist that the
   modular jar file they are producing can only be run on the module-path
   (with any attempt to use it on the class-path preventing application
   startup).  This would eliminate the need for testing both class-path
   and module-path."

Yes, this would eliminate the need for dual testing, but only if you're
willing to baseline to Java 11.  As with a unified class/module path,
however, if you want your library to work on earlier releases then you'd
also, still, have to test on the class path, and you'd make it harder for
application maintainers to migrate old class-path applications.  I don't
think this idea is worth pursuing.

What ideas are worth pursuing?  We should, by all means, continue to
improve our tools, Jigsaw itself, and the rest of the JDK.  Several of
us here collaborated on the initial support for modular development in
Maven, but clearly there's more to do.  If nothing else, the Surefire
plugin should be able to test a library on both the class path and the
module path.  There's also at least one improvement in the JDK worth
considering, which a few of us have already discussed, namely a small
enhancement to javac that would allow a single invocation to compile a
module, in module mode [2], but target class files other than
`module-info.class` to an earlier release [3].
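
(For reference, the usual workaround today is two javac invocations; a
rough sketch, with hypothetical paths, and exact flags varying by build
setup:

    javac --release 8 -d target/classes \
        $(find src/main/java -name '*.java' ! -name 'module-info.java')
    javac --release 9 -d target/classes \
        -sourcepath src/main/java -implicit:none \
        src/main/java/module-info.java

The second pass compiles only the module declaration, so every other
class file remains usable on the class path of pre-9 releases.)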

To sum up: We have more work to do, based on practical experience.  This
shouldn't be a surprise, with a change to the platform of this scope.
Let's keep doing that work, let's not unduly alarm ourselves or anyone
else, and please let's not throw out the baby with the bathwater.

- Mark


[1] http://blog.joda.org/2018/03/jpms-negative-benefits.html
[2] http://openjdk.java.net/jeps/261#Compile-time
[3] https://bugs.openjdk.java.net/browse/JDK-8200254

Re: The baby and the bathwater

Stephen Colebourne
On 26 March 2018 at 19:08,  <[hidden email]> wrote:
> Stephen Colebourne's recent blog entry

Thanks for the thoughtful reply, much of which I agree with.

> Stephen's main complaint here is only about the need
> to test a library on both the class path and the module path if it's
> intended to work on Java 9 or later.  With automated testing this
> shouldn't, in principle, be a huge burden,

To a degree this depends on the size of your test suite. Some suites
are large, and running the entire suite twice in continuous
integration could be onerous.

I think the main complaint, however, is more subtle than a need to test
twice. It is the need to test twice _forevermore_.

I.e. if this were just a transition phase, which would pass when Java
11 is the baseline, the situation would be painful but manageable. But
as things stand, there is no future time when my module will be
guaranteed to be treated as a module.

> Stephen closes with a specific suggestion:
> [snip]
> Yes, this would eliminate the need for dual testing, but only if you're
> willing to baseline to Java 11.

And that is exactly the point. At some point, Java 11 will be the new
baseline. The trade-offs Jigsaw chose for the Java 8 to 9 transition
will at some point not be the right ones for the long term. There has
to be a time when a library developer can rely on strong encapsulation
and when the class-path can't just be used to completely nullify
module-info.java. Otherwise, what's the point of modularisation?

I'm arguing that it should be the module author's choice to apply the
tougher rules, but I'm very happy to hear other alternatives in the
problem-space.

> There's also at least one improvement in the JDK worth
> considering, which a few of us have already discussed, namely a small
> enhancement to javac that would allow a single invocation to compile a
> module, in module mode [2], but target class files other than
> `module-info.class` to an earlier release [3].

+1

Stephen

Re: The baby and the bathwater

Cédric Champeau
Dual testing is a minimum. In practice, it depends on the kind of tests.
Typically, before JDK 9, you never needed a JAR to execute unit tests.
Maven happens to build one, but in practice a class directory +
resources is enough (which is what Gradle uses when it knows a JAR is not
required). For integration or functional tests you need the JAR, though,
which means there are more combinations to test (class directory, JAR on
the class path, JAR on the module path, different runtimes, ...). This is
not necessarily a problem, and we have the tools to do it, but the setup
might not be super convenient.

Re: The baby and the bathwater

Alan Bateman
On 27/03/2018 08:15, Cédric Champeau wrote:
> Dual testing is a minimum. In practice, it depends on the kind of tests.
> Typically, before JDK 9, you never needed a JAR to execute unit tests.
> Maven happens to build one, but in practice a class directory +
> resources is enough
This hasn't changed. You can put directories containing the test classes
+ resources on the class path as before. When testing modules you can
patch a module to add the test classes (and resources) that are compiled
into a directory; there is no need for either the module or the tests
to be packaged as JAR files.
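
A rough sketch of that setup, with hypothetical module and directory
names:

    java --module-path mods \
        --patch-module org.example.lib=build/test-classes \
        --add-modules org.example.lib \
        ...

Here `--patch-module` overlays the compiled test classes and resources
onto the module, so nothing has to be packaged.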

Maybe your comment is about testing libraries that are Multi-Release JARs?

-Alan.

Re: The baby and the bathwater

Cédric Champeau
Yes, precisely. The fact that your library works as a class directory +
resources doesn't mean it still works once packaged. And since there's a
recommendation to use MR JARs to package module-info.class in the case
of a library targeting both the class path (pre-Java 9 and Java 9) and
the module path (Java 9+), it's important to check both. Before, you
could afford to unit-test only the class-directory variant, but that's
no longer a safe assumption.


Re: The baby and the bathwater

Remi Forax
In reply to this post by Alan Bateman



> On 27/03/2018 08:15, Cédric Champeau wrote:
>> Dual testing is a minimum. In practice, it depends on the kind of tests.
>> Typically, before JDK 9, you never needed a JAR to execute unit tests.
>> Maven happens to build one, but in practice a class directory +
>> resources is enough
> This hasn't changed. You can put directories containing the test classes
> + resources on the class path as before. When testing modules you can
> patch a module to add the test classes (and resources) that are compiled
> into a directory; there is no need for either the module or the tests
> to be packaged as JAR files.

with the limitation that you cannot patch a module-info, so if you have test-only dependencies like JUnit and you want to run in module mode, you have to generate a JAR.


Rémi

Re: The baby and the bathwater

Alan Bateman
On 27/03/2018 10:04, Remi Forax wrote:
> with the limitation that you cannot patch a module-info, so if you have test-only dependencies like JUnit and you want to run in module mode, you have to generate a JAR.
>
The --add-reads option is used to augment the module to read junit or
other modules that are only required when testing. It works with
exploded modules. I can't think of any limitations or differences
between exploded and packaged modules to explain your comment.
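
A sketch of such a test launch, with hypothetical names, resolving JUnit
as an automatic module from the module path:

    java --module-path mods:libs \
        --add-modules junit \
        --add-reads org.example.lib=junit \
        --patch-module org.example.lib=build/test-classes \
        ...

Nothing here requires a packaged JAR; the patched module can be an
exploded directory.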

Also, Cédric's comment about putting the module-info.class into a
versioned section of an MR JAR is not something that has been recommended
here. He might be running into issues with class path scanning tools
that can't handle v53.0 class files or assume "module-info" is a valid
class name. Those same tools may still have issues with .class files in
META-INF/versions of course.

-Alan

Re: The baby and the bathwater

Gregg Wonderly
In reply to this post by mark.reinhold
I think that Stephen is largely announcing that Jigsaw presents more problems than it solves for the community.  My list of issues, which I have shared before, goes basically like this.

1. Modules should have versions which new community tooling could use for managing details the way Maven does now; today, though, no explicit version details are available at runtime, and there should be.  Java's dynamic binding mechanisms would allow a lot more flexibility in how software is assembled at runtime, and that would be a good thing.
2. The complete set of changes around Jigsaw is primarily about isolating the community from the innards of the JDK implementation, a reliance the community developed because the JCP was not functioning and cared more about making money than about actual software development.  Now they have to completely rearchitect large amounts of software that is mostly 100% reliable in order to use these JDK "improvements".  What's the chance of that happening for anything that is not actively developed, because it already works as needed?
3. It seems from the outside that little attention was paid, until late in the game, to the 80% of Java users who don't write code or use enterprise services, but instead have Java applets, Web Start clients, and clickable JARs.  This derailed the project's desire to lock things down in JDK 9.  Now that the realities of all of this are becoming much more obvious, it's much clearer that a lot of Java code runs without deployment plans or explicit lifecycle management of either the Java software or the JDK version used to run it.  The result is that more and more people are going to give up on Java: they will try to use an application that has always worked for them, and it will either silently fail (a clickable JAR with a security violation) or scream about some kind of security violation (an applet or Web Start app), which will alarm them into deleting it and moving on.
4. When I was a member of the Sun Developer Advisory Council in the early 2000s, I learned a lot about how Sun was focused on just the enterprise opportunity to sell hardware and support contracts.  How did that work out?  Oracle, which is also a "server" company, demonstrated again with Jigsaw that only the enterprise platform environment, with planned software schedules and controlled releases, was interesting.  At SDAC parties with James Gosling, we heard stories about his frustration with the desktop environment being ignored by Sun management.  There were a lot of "I'm sorry" and "I'm trying to work on those things" comments.  Where is Gosling now in Java involvement?
5. The AWT and Swing parts of Java, as well as DOM integration with applets, haven't been touched or worked on in decades.  The demise of the NetBeans team, and other changes around various platforms and people who were serving communities outside the enterprise, demonstrate that the platform is no longer viable for anything except enterprise-class projects.  I'd suggest that even there it's questionable, because these JDK changes provide the things Oracle needs without providing things the community benefits from, beyond Oracle's argument that now the JDK can actually be improved.

I know this sounds extremely whiny and fussy.  I understand how it can easily be ignored.  The point for me is simply that after 20 years of Java, with the first 5-10 years being very exciting because of the opportunity to unify platforms, we are now not doing that at all.  Apple has moved on with Swift, so that platform is by and large not interested in Java.  Windows has .NET, like it or not, and people use that platform because of the rich end-to-end integration it has with all aspects of the platform, unlike Java, which seems focused on ignoring the UX of desktop environments.  Thus we are left with Linux/BSD/Solaris as the only place Java is being used, to some degree.  Python has a lot of traction because it is lighter weight and has better platform integration.

It's sad to say these things, because Java could have been so much more.  But instead it means less and less to fewer and fewer people, largely because the only focus is on the 10,000 customers who do enterprise computing instead of the 10,000,000 developers who could really benefit from using Java.  That was on a slide at the first SDAC meeting.

Java was going to be for everyone!

Gregg


Re: The baby and the bathwater

Cédric Champeau
2018-03-27 15:56 GMT+02:00 Gregg Wonderly <[hidden email]>:

> I think that Stephen is largely announcing that Jigsaw presents more
> problems than it solves for the community.  My list of issues, which I
> have shared before, goes basically like this.
>
> 1. Modules should have versions which new community tooling could use
> for managing details the way Maven does now; today, though, no explicit
> version details are available at runtime, and there should be.  Java's
> dynamic binding mechanisms would allow a lot more flexibility in how
> software is assembled at runtime, and that would be a good thing.
>

I'm not sure what you mean by "modules should have versions". They do,
today, it's just not used, and I think it's a good thing. So I assume you
are talking about enforcing requirements on versions in the module info
file, in which case I strongly disagree. Disclaimer: I'm in the Gradle
team. We think it's the build tool (or the runtime system, for runtime
plugins) that should determine the right versions, given a set of
constraints provided by the producer (the library author) and the
consumer. Any real-world application has version conflicts, and solving
conflicts is *not* a trivial business, so you clearly don't want the JVM
to do it. What if a library requires module X:1.0.1, but you discover a
critical vulnerability in X? Should you upgrade to 1.0.2? Where do you
find this information? Metadata is live; the only thing the module
should say, IMO, is precisely what it says today: "I require this
module", and then delegate to the build tool the responsibility of
finding a version that works given all requirements and constraints
(environment, target JDK, vulnerabilities, ...). I think what the JDK
does today is the right balance. In particular, module-info is mostly
focused on the runtime aspect, but there are clear differences between
what you need to compile, to run, or to compile against a component. The
constraints are not necessarily the same for all of those, so they
shouldn't be mixed.

Re: The baby and the bathwater

Neil Bartlett
On Tue, Mar 27, 2018 at 3:09 PM, Cédric Champeau <[hidden email]> wrote:

> 2018-03-27 15:56 GMT+02:00 Gregg Wonderly <[hidden email]>:
>
>> I think that Stephen is largely announcing that Jigsaw presents more
>> problems than it solves for the community.  My list of issues, which I
>> have shared before, goes basically like this.
>>
>> 1. Modules should have versions which new community tooling could use
>> for managing details the way Maven does now; today, though, no explicit
>> version details are available at runtime, and there should be.  Java's
>> dynamic binding mechanisms would allow a lot more flexibility in how
>> software is assembled at runtime, and that would be a good thing.
>
> I'm not sure what you mean by "modules should have versions". They do,
> today, it's just not used, and I think it's a good thing. So I assume you
> are talking about enforcing requirements on versions in the module info
> file, in which case I strongly disagree. Disclaimer: I'm in the Gradle
> team. We think it's the build tool (or the runtime system, for runtime
> plugins) that should determine the right versions, given a set of
> constraints provided by the producer (the library author) and the
> consumer. Any real-world application has version conflicts, and solving
> conflicts is *not* a trivial business, so you clearly don't want the JVM
> to do it. What if a library requires module X:1.0.1, but you discover a
> critical vulnerability in X? Should you upgrade to 1.0.2? Where do you
> find this information?
>

Version ranges solve this problem.


> Metadata is live; the only thing the module should say, IMO, is
> precisely what it says today: "I require this module", and then delegate
> to the build tool the responsibility of finding a version that works
> given all requirements and constraints (environment, target JDK,
> vulnerabilities, ...).


Whether or not JPMS enforces version constraints, the inability to even
state a dependency upon a version of a module has created duplication. You
have to state "require module" in module-info, and you have to repeat it
with additional version information in the build descriptor (pom.xml,
build.gradle, etc).

If we could put a version requirement in module-info then the build tool
could use that; this could have been achieved simply by permitting
annotations on module-info 'require' statements.
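
A sketch of the idea; the syntax below is hypothetical, and javac does
NOT accept it today, precisely because annotations are not permitted on
individual 'requires' directives:

    module org.example.lib {
        @Version("[1.4,2.0)")  // hypothetical annotation, for build tools only
        requires com.acme.dep;
    }

The runtime would ignore the annotation entirely; build tools would read
it and drop the duplicated version from pom.xml or build.gradle.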


> I think what the JDK does today is the right balance. In particular,
> module-info is mostly focused on the runtime aspect, but there are clear
> differences between what you need to compile, to run, or to compile
> against a component. The constraints are not necessarily the same for
> all of those, so they shouldn't be mixed.
>

In what sense is module-info focused on the runtime aspect? It is enforced
at both compile time and runtime, and yet it does not provide sufficient
information for either the build tooling OR the runtime to assemble a
consistent set of modules that work together.

Re: The baby and the bathwater

Cédric Champeau
> Version ranges solve this problem.
>

They don't. They introduce new categories of problems (reproducibility,
where to put the boundaries) and don't account for the fact that what we
know about a version changes after publication (a vulnerability is rarely
discovered before release, typically).


>  Whether or not JPMS enforces version constraints, the inability to even
> state a dependency upon a version of a module has created duplication. You
> have to state "require module" in module-info, and you have to repeat it
> with additional version information in the build descriptor (pom.xml,
> build.gradle, etc).
>

You don't have to repeat it: nothing says that _you_ must write the
module file by hand. Also, the build tool _may_ source the dependencies
from the module-info file (but that wouldn't be enough, because you want
different dependencies for test, compile, API, ...). And a version is
often misleading. What does it mean when you write "I depend on 1.0.4"?
Does it mean that it doesn't work on 1.0.3, or does it mean that 1.0.4
was the latest version available when you built? Or does it mean that
this version is provided by your runtime environment, so it's a strict
dependency? We're currently tackling all these problems, which are
real-world problems on medium to large applications. A single version
number is often not enough: you need constraints, and sometimes variants
(think classifiers).


> In what sense is module-info focused on the runtime aspect? It is enforced
> at both compile time and runtime, and yet it does not provide sufficient
> information for either the build tooling OR the runtime to assemble a
> consistent set of modules that work together.

The module info file defines the module graph, and is enforced at compile
and runtime. However, it doesn't account for what you need:

- when you build your library: API and implementation dependencies
- when someone builds against your library: only API dependencies
- when you run the library (API, implementation and "runtime only"
dependencies)

Nor does it know which of those are provided by the runtime environment, or
compile tools. It only knows they are required, but barely knows who
provides them, and for what use.

Re: The baby and the bathwater

Jochen Theodorou
In reply to this post by Gregg Wonderly
On 27.03.2018 15:56, Gregg Wonderly wrote:
> I think that Stephen is largely announcing that Jigsaw presents more problems than it solves for the community.  My list of issues, which I have shared before, goes basically like this.
>
> 1. Modules should have versions which new community tooling could use for managing details the way Maven does now; today, though, no explicit version details are available at runtime, and there should be.  Java's dynamic binding mechanisms would allow a lot more flexibility in how software is assembled at runtime, and that would be a good thing.

The module system is all about *not* dynamically assembling applications
at runtime; jlink is the high point of that so far.

> 2. The complete set of changes around Jigsaw is primarily about isolating the community from the innards of the JDK implementation, a reliance the community developed because the JCP was not functioning and cared more about making money than about actual software development.  Now they have to completely rearchitect large amounts of software that is mostly 100% reliable in order to use these JDK "improvements".  What's the chance of that happening for anything that is not actively developed, because it already works as needed?

I still find comments like "not caring enough about actual software
development, but money making instead" a bit unfair. Sure, those cases
exist. But in other cases there used to be no other way, or there were
other needs that forced a solution done in a very stupid way.

For example, for me to be able to call constructors of a superclass I
have to have an invokespecial. Since I have to decide what constructor
to call at runtime, I basically have something like a switch with
multiple invokespecial instructions, the switch-case deciding which one
to take. This is a major hack; no Java compiler would ever emit code
like that. But the Java compiler decides this at compile time, which is
not my requirement at all. And of course when the verifier was rewritten
in Java 8, this stopped working for a while. But do you think there will
ever be a better solution given these requirements? I doubt it.
invokespecial has very specific semantics in this case, which are not
going to change in my favor just because I need the feature. The
alternatives are to introduce strange unrelated constructors, give up
extending classes, wrap everything, impose constraints on the
constructors... Alternatives are there... all horrible.

I can give quite a list of things that are the way they are because the
software is almost 15 years old now. And I am pretty sure it would never
have come into existence in a world of Java 9+... and that last point is
actually what saddens me the most here.

Anyway: you went down a path that no longer fits Jigsaw, and now you
have to change the architecture, big time. So big that there is no
chance of hiding the changes from my users. And suddenly I will have a
MOP that has to expose do-what-you-want-with-me Lookup objects, just to
keep things somehow working.

And for projects that are not actively developed it is easy: they will
die at some point. Not that this is good; it's just the effect.

> 3. It seems from the outside that little attention was paid, until late in the game, to the 80% of Java users who don't write code or use enterprise services, but instead have Java applets, Web Start clients, and clickable JARs.  This derailed the project's desire to lock things down in JDK 9.  Now that the realities of all of this are becoming much more obvious, it's much clearer that a lot of Java code runs without deployment plans or explicit lifecycle management of either the Java software or the JDK version used to run it.  The result is that more and more people are going to give up on Java: they will try to use an application that has always worked for them, and it will either silently fail (a clickable JAR with a security violation) or scream about some kind of security violation (an applet or Web Start app), which will alarm them into deleting it and moving on.

Enterprise services... considering how long Jigsaw chose to ignore
things like dependency injection, and the continuing comments about how
Jigsaw is not for the enterprise, there is also a large portion of the
enterprise services that got no attention.

> 4. When I was a member of the Sun Developer Advisory Council in the early 2000s, I learned a lot about how Sun was focused on just the enterprise opportunity to sell hardware and support contracts.  How did that work out?  Oracle, which is also a "server" company, demonstrated again with Jigsaw that only the enterprise platform environment, with planned software schedules and controlled releases, was interesting.  At SDAC parties with James Gosling, we heard stories about his frustration with the desktop environment being ignored by Sun management.  There were a lot of "I'm sorry" and "I'm trying to work on those things" comments.  Where is Gosling now in Java involvement?
> 5. The AWT and Swing parts of Java, as well as DOM integration with applets, haven't been touched or worked on in decades.  The demise of the NetBeans team, and other changes around various platforms and people who were serving communities outside the enterprise, demonstrate that the platform is no longer viable for anything except enterprise-class projects.  I'd suggest that even there it's questionable, because these JDK changes provide the things Oracle needs without providing things the community benefits from, beyond Oracle's argument that now the JDK can actually be improved.

You forgot JavaFX. Since it will no longer be part of the JDK, I have
heard very worried words from some people, and others are giving up
JavaFX completely. And no, they do not go to Swing or AWT; they move to
JavaScript UIs. That means Java on the server, if at all. Nashorn isn't
even considered an alternative to the likes of Node and Electron.
Desktop Java, whether as application, applet, or Web Start, will have a
very, very bad standing in the future. The only reason to keep Java on
the server and not use, for example, Go instead is... the supporting
libraries... many of which have to make a lot of changes because of
Jigsaw.

No application I have worked on in the last 4 years would have run as a
module on Java 9 as it was, nor will it now.

> I know this sounds extremely whiny and fussy.  I understand how it can easily be ignored.  The point for me is simply that after 20 years of Java, with the first 5-10 years being very exciting because of the opportunity to unify platforms, we are now not doing that at all.  Apple has moved on with Swift, so that platform is by and large not interested in Java.  Windows has .NET, like it or not, and people use that platform because of the rich end-to-end integration it has with all aspects of the platform, unlike Java, which seems focused on ignoring the UX of desktop environments.  Thus we are left with Linux/BSD/Solaris as the only place Java is being used, to some degree.  Python has a lot of traction because it is lighter weight and has better platform integration.
>
> It's sad to say these things, because Java could have been so much more.  But instead it means less and less to fewer and fewer people, largely because the only focus is on the 10,000 customers who do enterprise computing instead of the 10,000,000 developers who could really benefit from using Java.  That was on a slide at the first SDAC meeting.
>
> Java was going to be for everyone!

But somebody has to pay for the development; it is not as if the JVM
were a cash cow. Let me ask a different question, though: why should I
use Java in the cloud? Oracle fails to answer this, in my opinion. And
in my opinion Oracle is well on the way to losing Java on the desktop as
well as on the server. I may be overestimating trends I imagine I see,
of course. But frankly, the superior JVM technology and not having to
compile for each platform (I ignore jlink here) are the things keeping
me around for now. Java as a platform would have to set a new positive
trend to get out of this... difficult to do.

bye Jochen

Re: The baby and the bathwater

Peter Levart
In reply to this post by mark.reinhold


On 03/26/18 20:08, [hidden email] wrote:
> Stephen closes with a specific suggestion:
>
>    "There needs to be a way for a library author to insist that the
>     modular jar file they are producing can only be run on the module-path
>     (with any attempt to use it on the class-path preventing application
>     startup).  This would eliminate the need for testing both class-path
>     and module-path."

That's easy to enforce at runtime. Just take a "victim" class from your
library that is most often needed when your library is used (or take a
couple of them) and add a class initialization block like the following:

public class Whatever {

    static {
        // Classes loaded from the class path end up in the unnamed
        // module, whose getName() returns null.
        if (Whatever.class.getModule().getName() == null) {
            throw new Error("Can only use this library as a module");
        }
    }
}


Regards, Peter



Re: The baby and the bathwater

Remi Forax
Yes, or equivalently:

    !Whatever.class.getModule().isNamed()

Rémi


Re: The baby and the bathwater

Cédric Champeau
Although I doubt that making sure a library _only_ works on the module path
is a good idea, would it make sense to introduce a helper method for this
check? Something like:

Module.assertOnModulePath()

?
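
In the meantime, a user-land sketch of such a helper; the class below is
hypothetical, but it builds only on `Class.getModule()` and
`Module.isNamed()`, which exist today:

    public final class ModuleChecks {
        private ModuleChecks() {}

        // Throws if the anchor class was loaded into the unnamed module,
        // i.e. from the class path rather than the module path.
        public static void assertOnModulePath(Class<?> anchor) {
            if (!anchor.getModule().isNamed()) {
                throw new IllegalStateException(
                    anchor.getName() + " must be used on the module path");
            }
        }
    }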


Re: The baby and the bathwater

Neil Bartlett
In reply to this post by Cédric Champeau
On Tue, Mar 27, 2018 at 4:07 PM, Cédric Champeau <[hidden email]>
wrote:

>> Version ranges solve this problem.
>
> They don't. They introduce new categories of problems (reproducibility,
> where to put the boundaries) and don't account for the fact that what we
> know about a version changes after publication (a vulnerability is rarely
> discovered before release, typically).
>

They do. The fact that they create further (solvable) challenges does not
mean that they are not a solution to the initial problem.

The second point is also incorrect - indeed, this is exactly the point of
using a range rather than a point version. If I depend on a range "[1.4,
2.0)", this is because I need a *feature* that was released in version 1.4
of the dependency. My module is compatible with version 1.4.0 and if there
is a bugfix called 1.4.1 then my module is also compatible with that. My
version range does not imply anything about vulnerabilities that may exist
in version 1.4.0, and nobody is suggesting that it should.


>>  Whether or not JPMS enforces version constraints, the inability to even
>> state a dependency upon a version of a module has created duplication. You
>> have to state "require module" in module-info, and you have to repeat it
>> with additional version information in the build descriptor (pom.xml,
>> build.gradle, etc).
>>
>
> You don't have to repeat it: nothing says that _you_ must write the
> module file by hand. Also, the build tool _may_ source the dependencies
> from the module-info file (but that wouldn't be enough, because you want
> different dependencies for test, compile, API, ...).
>

You keep making my point for me! Dependencies are different at compile time
from runtime... not just in terms of versions but also identities. For
example it's common practice to compile against a pure API but deploy with
an implementation of the API. The module-info in JPMS, which is enforced at
both compile time and runtime, works against that practical insight.


> And a version is often misleading. What does it mean when you write "I
> depend on 1.0.4"?
>

Well what does it mean when you write "compile 'foo:bar:1.0.4'" in your
build.gradle file? It means you have compiled against that version of the
API, and your module will be compatible with that version up to the next
breaking change (2.0 if the dependency is using semver).

If you wanted your module to be compatible with 1.0.3 then you would have
compiled against 1.0.3. Unless you do that, there is no way for any tooling
to infer that you are indeed compatible with 1.0.3.


> Does it mean that it doesn't work on 1.0.3, or does it mean that 1.0.4
> was the latest version available when you built? Or does it mean that
> this version is provided by your runtime environment, so it's a strict
> dependency? We're currently tackling all these problems, which are
> real-world problems on medium to large applications. A single version
> number is often not enough: you need constraints, and sometimes variants
> (think classifiers).
>
>> In what sense is module-info focused on the runtime aspect? It is
>> enforced at both compile time and runtime, and yet it does not provide
>> sufficient information for either the build tooling OR the runtime to
>> assemble a consistent set of modules that work together.
>
> The module info file defines the module graph, and is enforced at compile
> and runtime. However, it doesn't account for what you need:
>
> - when you build your library: API and implementation dependencies
> - when someone builds against your library: only API dependencies
> - when you run the library (API, implementation and "runtime only"
> dependencies)
>
> Nor does it know which of those are provided by the runtime environment, or
> compile tools. It only knows they are required, but barely knows who
> provides them, and for what use.
>

If a module knows what it needs, it is not necessary to know "who" provides
it.

Re: The baby and the bathwater

Cédric Champeau
>
> They do. The fact that they create further (solvable) challenges does not
> mean that they are not a solution to the initial problem.
>
> The second point is also incorrect - indeed, this is exactly the point of
> using a range rather than a point version. If I depend on a range "[1.4,
> 2.0)", this is because I need a *feature* that was released in version 1.4
> of the dependency.
>

This is an arbitrary interpretation of a range. In practice people use them
for very different purposes. If, when you write "[1.4,2.0)", you assume
that you need a feature of 1.4, this is already an assumption. Most people
use 1.4 as the baseline because _this is the latest version available when
I started_. They also _assume_ that anything from 1.4 would work. In
practice this is rarely the case. There are bugs introduced, there are
binary incompatibilities (despite semantic versioning). Ranges are a
_convenience_, but certainly not an answer. Ranges + locking are better,
because you _have to_ test, but they don't account for the environment
either (say, my app depends on servlet-api, which version should you use?
It should be _strictly_ what the runtime environment will give).


>
> Well what does it mean when you write "compile 'foo:bar:1.0.4'" in your
> build.gradle file? It means you have compiled against that version of the
> API, and your module will be compatible with that version up to the next
> breaking change (2.0 if the dependency is using semver).
>
>
In Gradle you'd not use `compile` anymore. You would use:

api 'foo:bar:1.0.4'

for an API dependency, one that is exposed by your very own library (mostly
maps to "requires transitive")

and you'd use:

implementation 'foo:baz:2.1.4'

for an implementation dependency, that is _not_ exposed by your API (mostly
maps to "requires").

And if a transitive dependency needs a different version, we have
strategies to select a best match, or fail.
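
For example, a sketch of two such strategies (the coordinates are
hypothetical; `failOnVersionConflict()` and `force` are long-standing
Gradle APIs):

    configurations.all {
        resolutionStrategy {
            failOnVersionConflict()   // fail instead of silently picking a winner
            force 'foo:baz:2.1.4'     // or pin a version explicitly
        }
    }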


> If you wanted your module to be compatible with 1.0.3 then you would have
> compiled against 1.0.3. Unless you do that, there is no way for any tooling
> to infer that you are indeed compatible with 1.0.3.
>

That's again an oversimplification. Real-world apps have different
problems. You may say "1.0.3", because 1.0.2 had a bug. Maybe your app
didn't even depend on the faulty behavior, but because it had a bug, you
upgraded. And, maybe one of your dependencies actually required 1.0.2
because 1.0.3 introduced a regression. So you want to be able to downgrade
dependencies. Gradle makes it possible. There's a big difference between
the "ideal world", and the world we live in.

Re: The baby and the bathwater

Stephen Colebourne
In reply to this post by Peter Levart
On 28 March 2018 at 08:28, Peter Levart <[hidden email]> wrote:

> That's easy to enforce at runtime. Just take a "victim" class from your
> library that is most often needed when your library is used (or take a
> couple of them) and add a class initialization block like the following:
>
> public class Whatever {
>
>     static {
>         if (Whatever.class.getModule().getName() == null) {
>             throw new Error("Can only use this library as a module");
>         }
>     }
> }

Agreed that this has always been possible, but it is code, not
metadata. Really, it should be a startup JPMS error if the module
isn't running in the expected mode. That way tools like Maven and
Gradle can also take decisions based on the metadata.

Stephen

Re: The baby and the bathwater

Neil Bartlett
In reply to this post by Cédric Champeau
On Wed, Mar 28, 2018 at 10:01 AM, Cédric Champeau <[hidden email]> wrote:

>> They do. The fact that they create further (solvable) challenges does not
>> mean that they are not a solution to the initial problem.
>>
>> The second point is also incorrect - indeed, this is exactly the point of
>> using a range rather than a point version. If I depend on a range "[1.4,
>> 2.0)", this is because I need a *feature* that was released in version 1.4
>> of the dependency.
>>
>
> This is an arbitrary interpretation of a range. In practice people use
> them for very different purposes. If, when you write "[1.4,2.0)", you
> assume that you need a feature of 1.4, this is already an assumption. Most
> people use 1.4 as the baseline because _this is the latest version
> available when I started_. They also _assume_ that anything from 1.4 would
> work. In practice this is rarely the case. There are bugs introduced, there
> are binary incompatibilities (despite semantic versioning). Ranges are a
> _convenience_, but certainly not an answer. Ranges + locking are better,
> because you _have to_ test, but they don't account for the environment
> either (say, my app depends on servlet-api, which version should you use?
> It should be _strictly_ what the runtime environment will give).
>

You keep mixing up the perspective of an application (and application
assembler) with the perspective of a library/module.

As a library developer I should pick the lowest version of my dependencies
that my library can build against. When assembling an application you pick
the highest version of each module such that you have a graph that will
resolve.

Version ranges in a library indicate compatibility; they say nothing about
buggy point versions of the dependency. Yes you still need a mechanism for
locking buggy versions but you store that information outside the module
descriptor (because we cannot know about buggy versions that may be
released in the future). That locking mechanism is in the domain of
application assembly.


>
>> Well what does it mean when you write "compile 'foo:bar:1.0.4'" in your
>> build.gradle file? It means you have compiled against that version of the
>> API, and your module will be compatible with that version up to the next
>> breaking change (2.0 if the dependency is using semver).
>>
> In Gradle you'd not use `compile` anymore. You would use:
>
> api 'foo:bar:1.0.4'
>
> for an API dependency, one that is exposed by your very own library
> (mostly maps to "requires transitive")
>
> and you'd use:
>
> implementation 'foo:baz:2.1.4'
>
> for an implementation dependency, that is _not_ exposed by your API
> (mostly maps to "requires").
>
> And if a transitive dependency needs a different version, we have
> strategies to select a best match, or fail.
>

Good to know. And how do you transform that information into module-info?
You talked about generating module-info but it sounds like you would need
two of them... one with foo.bar for compile (otherwise javac will barf) and
the other with foo.baz for runtime (otherwise the runtime resolver will
barf).


>
>
>> If you wanted your module to be compatible with 1.0.3 then you would have
>> compiled against 1.0.3. Unless you do that, there is no way for any tooling
>> to infer that you are indeed compatible with 1.0.3.
>>
>
> That's again an oversimplification. Real-world apps have different
> problems. You may say "1.0.3", because 1.0.2 had a bug. Maybe your app
> didn't even depend on the faulty behavior, but because it had a bug, you
> upgraded. And, maybe one of your dependencies actually required 1.0.2
> because 1.0.3 introduced a regression. So you want to be able to downgrade
> dependencies. Gradle makes it possible. There's a big difference between
> the "ideal world", and the world we live in.
>
>
>
I love being told that I don't live in the real world, and that the
problems I (and many others) have been solving for over a decade are
insoluble :-)

Downgrading with version ranges is of course possible so long as they are
used properly, but they also protect you from downgrading SO far that you
get problems like NoSuchMethodError, NoClassDefFoundError, and so on.

Re: The baby and the bathwater

Cédric Champeau
>
> You keep mixing up the perspective of an application (and application
> assembler) with the perspective of a library/module.
>
>
It's interesting that you say this, because we precisely value modeling
applications and libraries differently, using different plugins. However,
not everybody does that, and while we can think of _ideal_ ways to model
things, the truth is that most people don't reason like that. They use
Maven (or Ant), templates that generate projects for them, use `+` as
their dependency version, use BOMs to "suggest versions", or think that
there's no difference between a "compile" and a "test" scope so all
dependencies can go in a single JSON file shared by the whole company. So
we're not arguing about what the _ideal_ solution should be. We're
arguing about the interpretation of version numbers, and what people
expect.


>
> As a library developer I should pick the lowest version of my dependencies
> that my library can build against. When assembling an application you pick
> the highest version of each module such that you have a graph that will
> resolve.
>
> Version ranges in a library indicate compatibility; they say nothing about
> buggy point versions of the dependency. Yes you still need a mechanism for
> locking buggy versions but you store that information outside the module
> descriptor (because we cannot know about buggy versions that may be
> released in the future). That locking mechanism is in the domain of
> application assembly.
>

That's precisely the point. When, as a library author, you write
"[1.0, 2.0)", did you mean:

- I tested all versions from 1.0 to 2.0 (excluded), and they work (they
  are _compatible_), or
- I tested all versions from 1.0 to 1.4, because 1.4 was the latest
  available, and they work, and I suppose it's going to be true for
  anything up to 2.0, or
- I tested with 1.0, and hopefully any higher version should work (most
  likely what you intend to say), or
- you can build me with any version between 1.0 and 2.0, and it should
  compile fine, or
- you can build me with 1.0 and run with any later version, and it
  should run fine,
- ...

The reality is that 99% of developers don't distinguish between any of
these; they just take the latest version available and build against it.


>
> Good to know. And how do you transform that information into module-info?
> You talked about generating module-info but it sounds like you would need
> two of them... one with foo.bar for compile (otherwise javac will barf) and
> the other with foo.baz for runtime (otherwise the runtime resolver will
> barf).
>

Currently we don't do any generation. It's an option, but I wouldn't say
it's the best one; I think it's still too soon to make that decision. In
particular, module-info contains _additional_ information, like services,
that the build tool probably doesn't care about (again arguable, since we
could potentially model services too). Another option is to source
dependencies from module-info, but it's not that simple (parsing, mapping
to the appropriate configurations, ...). So while it's a bit annoying to
have redundancy between the dependencies declared in the build file and
those declared in the module-info file, there *is* interest in both. In
particular, what you build might not be just a single library. You might
want to share dependencies between modules and make sure they use the
same versions. You might want to produce a platform definition (BOM) too.
The things we produce are different from the things we need.

>
> I love being told that I don't live in the real world, and that the
> problems I (and many others) have been solving for over a decade are
> insoluble :-)
>
>
It's not about living in the real world or not. It's about the horrible
truth of the hundreds of modules published on Maven Central that use
hundreds of different conventions, in both versioning and publishing. And
about recognizing things like "you shouldn't have two SLF4J bindings on
your class path". There's no silver bullet, so I don't think putting
versions in module-info would solve this; on the contrary, it would
probably make things much harder for lots of people.