
File references #9

Open
flaxed opened this issue Aug 10, 2019 · 11 comments
Labels
feature Feature suggestion question Further information is requested

Comments

@flaxed
Contributor

flaxed commented Aug 10, 2019

There are myriad reasons to split type definitions across several files, which unfortunately raises the question of how to reference the files containing the type definitions being used.

Please comment on possible solutions, past experiences, successes, and, just as importantly, failures.

@flaxed flaxed added question Further information is requested feature Feature suggestion labels Aug 10, 2019
@leiradel

At Insomniac we used mcpp so we could #include other files and avoid having the parser itself locate and parse other files to gather all the definitions, but in the end I think it wasn't worth the trouble because:

  1. We had to use an external tool to be able to parse the DDL files
  2. We had to keep track of which definition was in which file using the #line directive, otherwise we would generate code for the definitions in the included files
  3. We had to pass flags to mcpp so it could find the included files
  4. Although I prefer a C-like syntax for a DDL, having a C pre-processor may feel too alien for programmers with a background in other programming languages

One benefit though is having all the power of a C pre-processor available for use in the DDL files.

I was recently thinking about different ways to add references to other files in an in-house DDL, and considered requiring fully qualified IDs for types defined in other files, but didn't have the time to explore it.

It was something like this:

struct particle {
    math::vec3 position;
    math::vec3 velocity;
};

math::vec3 would be looked up in <path_to_root_of_ddl_files>/math/vec3.ddl, but I don't like having to declare one structure per file.
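To make that lookup-by-convention concrete, here is a minimal sketch; the function name, root parameter, and .ddl extension are assumptions for illustration, not anything the tooling actually defines:

```python
def type_to_path(root: str, qualified_id: str) -> str:
    # Hypothetical convention: 'math::vec3' maps to '<root>/math/vec3.ddl',
    # i.e. each '::' segment becomes a path segment under the DDL root.
    return root.rstrip("/") + "/" + qualified_id.replace("::", "/") + ".ddl"

# type_to_path("/ddl", "math::vec3") -> "/ddl/math/vec3.ddl"
```

The downside mentioned above follows directly from this mapping: one file per type.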

@flaxed
Contributor Author

flaxed commented Aug 18, 2019

I agree with the avoid #include sentiment. Besides being C-centric, I'm not a fan of the issues it causes if you want to move the referenced files around.

Meanwhile I've been thinking a path include might not be needed at all. In the initial discussions, the topic of a project system (or the lack of one) was debated; it will probably end up being the responsibility of the pipeline to pass the tooling a list of files to include.
This approach, together with namespace support (#6), makes me think we can support your fully qualified name suggestion, plus a few extras like using statements.

Let's assume you have a foo/math/vec/vec.ddl file:

namespace Math;

def struct Vec3
{
    x: float,
    y: float,
    z: float,
}

With your particle example we can do:

using math;

struct particle {
    vec3 position;
    vec3 velocity;
};

And we could pass a list of namespaces to be included automatically to the libraries and tools, allowing us to use vec3 position directly.

@leiradel

I dislike using, but that's coming from a large code base where it did nothing but add confusion.

it will probably end up being the responsibility of the pipeline to pass the tooling a list of files to include.

The pipeline won't know which files a particular DDL file will need, and we don't want to pass all existing DDL files every time the tooling is invoked.

My two favorite ways to deal with this:

  1. By convention, where the type Math::Vec3 would be looked up in <path_to_root_of_ddl_files>/Math/Vec3.ddl
  2. By using explicit imports, where import Math.Vectors; will make the tooling load the definitions from <path_to_root_of_ddl_files>/Math/Vectors.ddl

The difference between #1 and #2 is that with #1 each type has to be defined in its own file, whereas with #2 a DDL file can contain multiple type definitions.

Both options need all DDL files to exist under a common root folder, which I wanted to hate but don't. This root folder would be a parameter to the tooling.
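A minimal sketch of option #2's resolution step, assuming a dotted import syntax and a single root-folder parameter (both still under discussion, and the names here are hypothetical):

```python
import re

# Matches hypothetical statements like 'import Math.Vectors;'
IMPORT_RE = re.compile(r"^\s*import\s+([\w.]+)\s*;", re.MULTILINE)

def resolve_imports(source: str, root: str) -> list[str]:
    # Each dotted name maps to a file under the common root:
    # 'Math.Vectors' -> '<root>/Math/Vectors.ddl'
    return [root.rstrip("/") + "/" + name.replace(".", "/") + ".ddl"
            for name in IMPORT_RE.findall(source)]

# resolve_imports("import Math.Vectors;", "/ddl") -> ["/ddl/Math/Vectors.ddl"]
```

This keeps the pipeline out of the picture: the tooling discovers dependencies from the file itself, needing only the root folder.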

@flaxed
Contributor Author

flaxed commented Aug 19, 2019

Do you have examples of those confusions? Always useful to know the shortcomings of other implementations.

I think this discussion is starting to invade the scope of the project system, whatever that will be, so might be better to create a new issue for this :) (Issue #11)

Regarding the convention approach, I have to argue against it. This project aims to create both a file syntax and tooling to support its use. If we impose any convention that conflicts with a project's existing layout, that is one less project that can benefit from this, and consequently less value this project brings to the community.

But I like way #2. It doesn't force a file convention, and it works similarly to other namespaced languages.
And it doesn't require explicit imports in the files for every type. This method still allows the user to pass a list of implicit imports to the tooling, like a prelude of sorts.
Whether we use using Math::Vectors or import Math.Vectors will probably end up being an implementation detail; I like that we have common ground on leveraging namespaces for referencing other files.

@leiradel

Do you have examples of those confusions?

Nothing special, just a developer who used using std; in some files and brought tons of stuff into the global namespace just to avoid typing std:: where necessary, which required some maintenance.

But I like way #2.

+1

@flaxed
Contributor Author

flaxed commented Aug 19, 2019

I think I see the concern. I'm assuming the DDL tooling on that project somehow added the using std; to the generated C++ file.
From my perspective this isn't a concern of the DDL tooling, but purely of the codegen side of things. So a using, import, or some other alternative should only inform the DDL parser where to look for a type, and should not appear in the generated files.

@leiradel

Sorry, I was under the impression that you were suggesting that DDL supported using math; to avoid having to type the fully qualified ID of types like math::vec3.

@flaxed
Contributor Author

flaxed commented Aug 22, 2019

Ah, right. That too. I was mixing both functionalities without even noticing it, where the two functionalities are:

  • Picking the right file/namespace/package from the name
  • Bringing the types in that source into scope

It might not be a good idea to do a 2-in-1, but maybe we can. We could mirror Rust's use.
Let's assume a .ddl with

use SomeFancyLib::math;

def struct Entity {
    position: math::vec3,
}

For this file, the reference system (from #11 (comment)) takes SomeFancyLib::math and returns /SomeFancyLib/math.ddl.

Then the type system maps any type prefixed with math:: to the scope of the file.

Seems simple enough to me, and hopefully sorts out your dislike for using.
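As a sketch of that 2-in-1 behavior (all names hypothetical; it assumes the last path segment becomes the in-scope prefix, which is my reading of the example above):

```python
def resolve_use(root: str, use_path: str) -> tuple[str, str]:
    # Reference system: 'SomeFancyLib::math' -> '<root>/SomeFancyLib/math.ddl'
    segments = use_path.split("::")
    file_path = root.rstrip("/") + "/" + "/".join(segments) + ".ddl"
    # Type system: types from that file are then referenced as 'math::vec3',
    # i.e. the last segment becomes the prefix brought into scope.
    prefix = segments[-1]
    return file_path, prefix

# resolve_use("", "SomeFancyLib::math") -> ("/SomeFancyLib/math.ddl", "math")
```

So one statement both locates the file and defines the prefix; nothing is dumped unqualified into the file's scope, which is the part of using that caused trouble before.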

@leiradel

I'm used to reading using as C++'s using, and import as Java's import. use meaning import confuses me.

@flaxed
Contributor Author

flaxed commented Aug 23, 2019

The keyword itself can be using, use, import, include, etc. Any of those brings their own assumptions from the languages that already use them.

The important thing is to figure out the behavior. Is there a benefit to separating things into two statements like C++ does, or can we keep it simple and do the 2-in-1 above?

@leiradel

Less magic is better. I would never think about implementing something like C++'s using.
