Do any languages specify package requirements in import / include statements?

foobarbecue | 40 points

In perl, you can do

    use Some::Library v1.2.3;
and there's assorted tooling that will turn that into some sort of[0] centralised dependency spec that installers can consume.

Also e.g. https://p3rl.org/lib::xi will automatically install deps as it hits them, and the JS https://bun.sh/ runtime does similar natively (though I habitually use 'bun install <thing>' to get a package.json and a node_modules/ tree ... which may be inertia on my part).

The perl use is a pure >= thing though, whereas I believe Raku (née perl6) has

    use Some::Raku::Library v1.*.*;
and similar but I'm really not at all an expert there.

[0] it's perl so there's more than one, although META.json and cpanfile are both supported by pretty much everything I recall caring about in the past N years

mst | 6 months ago

Go goes at least part-way there: https://golangbyexample.com/go-mod-tidy/ https://matthewsetter.com/go-mod-tidy-quick-intro/

You write your module source, then run `go mod tidy`, which reads your sources for imports and automatically creates the go.mod and go.sum files. What's nice about this is that it ensures reproducible builds, so you should add those files to your revision control repo.

drweevil | 6 months ago

Deno can import TS/JS directly from URLs, which is really nice for small 'shell scripts', but it has some considerable downsides for bigger projects:

https://deno.com/blog/http-imports

PS: also Godbolt's C/C++ compilers can directly #include from URLs, I guess they run their own custom C preprocessor over the code before passing it on to the compilers:

https://www.godbolt.org/z/6aTKo4vbM

flohofwoe | 6 months ago

The Cargo people are working on single-file "Rust script" support that would allow embedding the necessary parts of the manifest (Cargo.toml) directly in the .rs file as special doc comments (so things stay transparent to rustc): [0]

[0] https://rust-lang.github.io/rfcs/3424-cargo-script.html

Sharlin | 6 months ago

In F# scripts you can add dependencies directly from Nuget like so:

    #r "nuget: NodaTime, 3.2.1"

    open NodaTime

    let now = SystemClock.Instance.GetCurrentInstant()

    printfn "%A" now
greener_grass | 6 months ago

Groovy has Grapes. https://docs.groovy-lang.org/latest/html/documentation/grape...

Groovy is a Java variant that runs on the JVM and allows you to add dependencies as annotations. I believe it uses Maven on the back end, but it's just so convenient for scripts etc.

Pet_Ant | 6 months ago

Not exactly what you're talking about, but `uv` lets you specify dependencies in the header of scripts: https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...
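
That header is the PEP 723 inline-metadata block; a minimal sketch of what it can look like (the requests/rich dependencies are just example choices), which `uv run script.py` should pick up:

    # /// script
    # requires-python = ">=3.11"
    # dependencies = [
    #     "requests<3",
    #     "rich",
    # ]
    # ///
    import requests
    from rich.pretty import pprint

    resp = requests.get("https://peps.python.org/api/peps.json")
    pprint(sorted(resp.json())[:5])  # print the first few PEP numbers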

I think what you describe really only makes sense for a single-file script. I _do not_ want to manage dependency hell within my own source files.

philomath_mn | 6 months ago

The Raku Programming language allows one to specify the required version, the required authority and API level:

    use Foo::Bar:ver<0.1.2+>:auth<zef:name>:api<2>;

would only work if at least version 0.1.2 of the Foo::Bar module is installed, authored by "zef:name" (basically ecosystem + nick of the author), with API level 2.

Note that modules with the same name but different authorities and different versions can be installed next to each other.

Imports from modules are lexical, so one could even have one block use one version of a module, and another block another version. Which is handy when needing to migrate data between versions :-)

lizmat | 6 months ago

Python’s PEP 723 (Inline script metadata) has a section summarising why they couldn’t take this approach under “Why not infer the requirements from import statements?”

https://peps.python.org/pep-0723/#why-not-infer-the-requirem...

atenni | 6 months ago

JavaScript does. For example:

  <script src="https://cdn.jsdelivr.net/npm/lodash@4.17.21/lodash.min.js"></script>
Or using the new ES6 "import" syntax:

  <script type="module"> import lodash from 'https://cdn.jsdelivr.net/npm/lodash@4.17.21/+esm' </script>
iforgot22 | 6 months ago

You are mixing up program build parameters and program build arguments. In much the same way that a function has parameters and you substitute actual arguments when calling it, you should view your build as having parameters ("imports") and arguments ("specific package versions") that you pass to the corresponding imports.

Specifying a specific package version directly in your source would be like taking a function with parameters, then removing one of those parameters and replacing it with a local variable of the same name that you hardcode to a specific value. It is a perfectly fine thing to do if that parameter really should only ever have that specific value, but it is a fairly "fundamental" source code change; your function has fewer parameters and a hardcoded value now!

To be fair, as far as I am aware, no commonly used language seems to understand the distinction and syntactically distinguish build parameters from build arguments. What you should have is one syntactic operation that declares a parameter of type "package", a separate, distinct syntactic operation that instantiates a specific package and binds it to a name, and a separate, distinct syntactic operation that passes a package instance to a parameter. Then your build system is just instantiating specific packages and passing them as arguments to your files' imports.

In your case, you would then just be instantiating specific packages and assigning them to a "common" name. You would have no "imports" in this sense, as you have no parameters, only "local variables", and thus the build system would need to do nothing, as there are no "parameters" to your file. That, or your build system still instantiates the packages, but as "global variables" assigned to names of your choosing, which you would then just reference in your self-contained file.

Veserv | 6 months ago

Ruby's Bundler has an "inline" mode for this; it's mostly meaningful for single-file scripts, as you thought.

https://bundler.io/guides/bundler_in_a_single_file_ruby_scri...

riffraff | 6 months ago

This is Deno (JavaScript runtime). Package version and download location in the import path.

    import * as jose from 'https://deno.land/x/jose@v5.9.6/index.ts'
new_user_final | 6 months ago

See Python PEP 722 – Dependency specification for single-file scripts

https://peps.python.org/pep-0722/

benji-york | 6 months ago

Yes! In Roc you specify all of your packages in your main.roc file; there's never a need for an external config file. It is extremely nice to always be able to run standalone files. https://www.roc-lang.org/ An example: https://github.com/isaacvando/rtl/blob/main/rtl.roc#L1

isaacvando | 6 months ago

Dhall has imports from URLs, much like JavaScript. From their tutorial:

  {- Need to generate a lot of users?
  
     Use the `generate` function from the Dhall Prelude
  -}
  
  let generate = https://prelude.dhall-lang.org/List/generate
  
  {- You can import Dhall expressions from URLs that support
     CORS
  
     The command-line tools also let you import from files,
     environment variables, and URLs without CORS support.
  
     Browse https://prelude.dhall-lang.org for more utilities
  -}
  
  let makeUser = \(user : Text) ->
        let home       = "/home/${user}"
        let privateKey = "${home}/.ssh/id_ed25519"
        let publicKey  = "${privateKey}.pub"
        in  { home, privateKey, publicKey }
  
  let buildUser = \(index : Natural) ->
        {- `Natural/show` is a "built-in", meaning that
           you can use `Natural/show` without an import
        -}
        makeUser "build${Natural/show index}"
  
  let Config =
        { home : Text
        , privateKey : Text
        , publicKey : Text
        }
  
  in  {- Try generating 20 users instead of 10 -}
      generate 10 Config buildUser
thomastay | 6 months ago

Deno does this.

They ended up adding dependency files later on to make it possible to keep package versions in sync without changing every file on every version change.

ruduhudi | 6 months ago

A while ago I implemented something similar to this in Python, although specifying versions requires function calls instead of imports. It turns out that in Python you can execute arbitrary code during imports via hooks, including calling out to pip to install a dependency.

https://github.com/miedzinski/import-pypi
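
A minimal sketch of that idea (not the linked project's actual code): a last-resort meta path finder that shells out to pip when a normal import fails. It assumes the distribution name matches the import name, which is often not true in practice:

    import importlib.util
    import subprocess
    import sys

    class PipInstallFinder:
        """Try `pip install <name>` once when nothing else can find a module."""

        def __init__(self):
            self._attempted = set()

        def find_spec(self, name, path=None, target=None):
            if path is not None or name in self._attempted:
                return None  # only top-level names, one install attempt each
            self._attempted.add(name)
            install = subprocess.run([sys.executable, "-m", "pip", "install", name])
            if install.returncode != 0:
                return None
            importlib.invalidate_caches()  # let the regular finders see the new files
            return importlib.util.find_spec(name)

    # Appended last, so it only runs after the normal import machinery gives up.
    sys.meta_path.append(PipInstallFinder())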

mdzn | 6 months ago

Common Lisp seems like it would be a prime language for something like this, since one of the core tenets of the language is that all code is data, no?

That said, I'm not sure it is exactly what you are asking. They still somewhat expect you to have the system defined in a central spot as far as what all you depend on. Which I think you are almost always going to want.

That is, if you want individual files to be able to specify dependencies, that is probably doable; but you are almost certainly going to want something to pull those up to a central spot. If only so that you can work with it as a cohesive unit?

I get that you can feel these are somewhat redundant. But they are also meaningful? You could, similarly, not use any imports in a Java program and just fully qualify names as you use them, such that imports could be seen as redundant/unnecessary. At some point, you will want a place to say what names can and cannot be used in execution. Which ultimately puts you back in the same game.

taeric | 6 months ago

> it often feels to me that the dependency requirements list in pyproject.toml, requirements.json, maven.xml, CMakeLists.txt, contains information that is redundant to the import or include statements at the top of each file.

It doesn't. The name that you use for a third-party library in software generally isn't remotely enough information to obtain it, and it would be bad to have an ecosystem where it were - since you'd be locked into implementations for everything and couldn't write software that dynamically (even at compile time) chooses a backend. On the other hand, many people need to care about the provenance of a library and e.g. can't rely on a public repository because of the risk of supply-chain attacks. Lockfiles - like the sort described in the draft PEP 751 (https://peps.python.org/pep-0751/), or e.g. Gemfile.lock for Ruby - include a lot more information than a package name and version number for that reason (in particular, typically they'll have a file size and hash for the archive file representing the package).

> It seems to me that a reasonable design decision, especially for a scripting language like python, would be to allow specification of versions in the import statement (as pipreqs does) and then have a standard installation process download and install requirements based on those versioned import statements.

It's both especially common for naive Python developers to think this makes sense, and especially infeasible for Python to implement it.

First off, Python modules are designed as singleton objects in a process-global cache (`sys.modules`). Code commonly depends on this for correctness - modules will define APIs that mutate global state, and the change has to be seen program-wide.
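
A tiny illustration of that cache using the standard library:

    import sys
    import json

    # A second import returns the very same cached module object, so state
    # attached to the module is visible everywhere in the process.
    import json as json_again
    assert json_again is json
    assert sys.modules["json"] is json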

Even if the `import` syntax, the runtime import system and installers all collaborated to let you have separate versions of a module loaded in `sys.modules` (and an appropriate way to key them), it'd be impractical for different versions of the same module to discover each other and arrange to share that state. Plus, library authors would have to think about whether they should share state between different versions of the library. There are probably cases where it would be required for correctness, and probably cases where it must not happen for correctness. And it's even worse if the library author ever contemplates changing that aspect of the API.

Second, there's an enormous amount of legacy that would have to be cleaned up. Right now, there is no mapping from import names to the name you install - and there cannot be, for many reasons. Most notably, an installable package may legally provide zero or more import names.

I wrote about this recently on my blog: https://zahlman.github.io/posts/2024/12/24/python-packaging-... (see section "Your package name that ain't what you `import`, makes me frustrated").
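
As a concrete illustration (the output depends on what happens to be installed), `importlib.metadata.packages_distributions()` on Python 3.10+ maps import names to the distributions that provide them, and the two frequently differ:

    from importlib.metadata import packages_distributions

    mapping = packages_distributions()
    print(mapping.get("PIL"))   # e.g. ['Pillow'] if Pillow is installed
    print(mapping.get("yaml"))  # e.g. ['PyYAML'] if PyYAML is installed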

Third, Python is just way too dynamic. An `import` statement is a statement - i.e., actual code that runs when the code does, a step at a time, not just some compile-time directive or metadata. It can validly be anywhere in the file (including within a function - which occasionally solves real problems with circular imports); you can validly import modules in other ways (including ones which bypass the global cache by default - there are good system architecture reasons to use this); and the actual semantics can be altered in a variety of ways (the full system is so complex that I can't even refer you to a single overall document with a proper overview).
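
For example, a deferred import inside a function (the module name below is hypothetical) only runs when the function is first called, which is a common way to break a circular-import cycle:

    def render_report(data):
        # Hypothetical module, imported lazily at call time rather than load time.
        import heavy_plotting_lib
        return heavy_plotting_lib.plot(data)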

> For example, you have to figure out what happens if different versions of a requirement are specified in different files of the same package (in a sense, the concept of "package" starts to weaken or break down in a case like that).

As I hope I explained above, it's even harder than you seem to think. But also, this would be the only real benefit. If you want to have multiple files that always use the same version of a library, then it makes no sense to specify that version information repeatedly. (Repeating the import statement itself is valuable for namespacing reasons.)

> But in some cases, e.g. a single-file python script, it seems like it would be great.

Please read up on PEP 723 "Inline script metadata" - https://peps.python.org/pep-0723/. It does exactly what you appear to want for the single-file case - but through comment pragmas rather than actual syntax - and is supported by Pipx and uv (and perhaps others - and is in scope for my own project in this general area, Paper).

> Has anyone hacked or extended python / setuptools to work this way?

Setuptools has nothing to do with this and is not in a position to do anything about it.

zahlman | 6 months ago

Looking through https://dbohdan.com/scripts-with-dependencies, I see

- JS: Deno and Bun

- Scala: Scala CLI and Ammonite

- Python: fades (https://github.com/PyAr/fades)

epage | 6 months ago

Scala has Ammonite (https://ammonite.io/#IvyDependencies)

    import $ivy.`com.lihaoyi::scalatags:0.7.0 compat`, scalatags.Text.all._

    val rendered = div("Moo").render
svieira | 6 months ago

Scrapscript[0] comes to mind:

> Any chunk of the language can be replaced with a hash.

> These chunks are called “scraps”.

> Scraps are stored/cached/named/indexed in global distributed “scrapyards”.

[0]: https://scrapscript.org/

codethief | 6 months ago

Some do (Go), but it is really not the best idea. You do want to have one central place where you specify:

- version constraints

- source of packages (you may want to host them yourself one day)

- any additional metadata (some packages have options or features)

fiedzia | 6 months ago

For finl, I'm expecting that imports/package requirements will be one and the same, but that only works because, as a document language, there is a single source file. The basic idea is that a user would write, e.g.,

     \LoadExtension{provider:dependency:1.0}
or

     \DocumentFormat{ams:ams:1.0}
to load the specific version of an extension or format, with this then used to fetch the code from a remote repository if a copy isn’t already cached.

But for something like loading a dependency for Java or Rust, I don't think something like this would make sense. Or maybe I'm just too accustomed to the Maven way of doing things.

dhosek | 6 months ago

It would be great to eliminate manifest files, but dependency confusion, supply-chain attacks, and malicious project takeovers are a huge security challenge right now.

grajaganDev | 6 months ago

Tcl's "package require" statement doesn't go out and fetch packages from the 'net; they have to be installed locally. But it does let you specify constraints on the version of the package to use, everything from "any version after 1.2.3" to "any version between 1.2.3 and 2.3.4" to "only version 1.2.3".

wduquette | 6 months ago

You can get a similar effect in Elixir scripts with Mix.install/2: https://github.com/wojtekmach/mix_install_examples

It supports Git links as well as package names and versions from package repositories.

Also happens to be a very nice and capable scripting platform, with a reasonably small runtime.

cess11 | 6 months ago

Yes! Small scripts should live in a single, self-contained file.

In Firefly, you can specify dependencies at the top of the file:

https://www.firefly-lang.org/

When your project grows, you can move the dependencies to a single shared project file.

continuational | 6 months ago

Go supports this with `go run`, e.g.

    # Example 1: Running the latest version
    go run github.com/rakyll/hey@latest

    # Example 2: Running a specific version
    go run github.com/rakyll/hey@v0.1.4
quesomaster9000 | 6 months ago

In Java I use jbang: https://www.jbang.dev/

agilob | 6 months ago

Yes, I feel the same way. Anything less than having the compiler/interpreter automatically resolve `import <URL>` is idiotic and a massive waste of everyone's time.

"Modules" are a stupid over-engineered concept that is significantly worse in every way to includes that can be qualified/namespaced.

The fact that so many programmers defend this pointless nonsense is one of the many reasons I can't take this industry seriously anymore. "Engineers" overcomplicate loading a text file.

thuanao | 6 months ago

Another disadvantage would be that you'd have to run the code to figure out the dependencies, no?

n_plus_1_acc | 6 months ago