It depends.
In an ideal world, each Rez package is self-contained: it carries both the metadata - like the version and the requirements on other packages - and the payload, such as the entire Maya install.
That’s how it works for the vast majority of packages, and is the recommended workflow for Rez in general. However, because Maya is quite large, what most people do is let Maya reside locally, as a regular install, and have the package reference that install. The advantage is that you save space, network traffic and ultimately time spent loading Maya off a network; at the expense of having to install Maya locally on each machine, and of having this package not be self-contained. Not being self-contained means you can technically resolve a profile using this Maya package, but whether it works depends on the local environment.
So ideally, you would want every package self-contained - and some do self-contain even Maya and even larger packages (depending on how much space you have, how quick your network is, and how complicated it is to deploy software like Maya locally, using e.g. Ansible/Chef/Puppet). But if you can't (like most) then you would install Maya as usual, and make a package similar to the one you'll find in the Allzpark Demo.
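For illustration, a reference-style Maya package could look roughly like this - a minimal sketch, where the version and the install path are assumptions you would adjust to match the actual local install on your machines:

```python
# maya/package.py -- a "reference" package: Rez tracks only the
# metadata, while the payload is the regular local Maya install.
name = "maya"

version = "2018.0"

requires = []


def commands():
    # Assumed install location; point this at wherever Maya
    # actually lives on your local machines.
    env.PATH.prepend("C:/Program Files/Autodesk/Maya2018/bin")
```

This is exactly why such a package is not self-contained: the resolve succeeds anywhere, but the `commands()` above only produce a working environment on machines where Maya was installed at that path.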
The profiles for each project are independent of either option, and you could change your mind at any point without affecting the profiles (only the network traffic etc. mentioned above). They would look something like this:
**alita/package.py**

```python
name = "alita"

requires = [
    "~maya-2018",
]
```

**kingkong/package.py**

```python
name = "kingkong"

requires = [
    "~maya-2019",
]
```
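Note that the `~` prefix makes these weak requirements: Maya isn't pulled into the resolve unless you explicitly ask for it, but when you do, the version is constrained to the one listed in the profile. Assuming `rez` is on your PATH, resolving each profile together with Maya would look roughly like this (a sketch, not verified against your setup):

```shell
# Resolve the alita profile; "~maya-2018" only constrains Maya
# once it is requested, so ask for it explicitly:
rez env alita maya -- maya

# The kingkong profile pins Maya to 2019 the same way:
rez env kingkong maya -- maya
```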
If you do decide to self-contain Maya (and other DCCs), then you could also employ the Allzpark localisation feature (see Localisation on the landing page) to synchronise your networked Maya package with a local copy for local performance. Best of both worlds.