How to migrate a multi-instance LSB script to systemd?
I wrote an LSB init script that can manage multiple instances of my daemon: `rcfoo start` starts all the instances (which are found in some /etc configuration file), `rcfoo stop` stops all the instances, `rcfoo status` displays the status of all instances, and `rcfoo reload` reloads the daemon with a changed configuration.
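(For reference, the script follows roughly the sketch below; the path `/etc/foo/instances` and the helper names are illustrative placeholders, not the real ones.)

```sh
#!/bin/sh
# Illustrative sketch only: /etc/foo/instances and the helpers are placeholders.
INSTANCES=$(cat /etc/foo/instances)

case "$1" in
    start)
        for i in $INSTANCES; do
            start_instance "$i"   # placeholder: launch one daemon instance
        done
        ;;
    stop)
        for i in $INSTANCES; do
            stop_instance "$i"    # placeholder: terminate one daemon instance
        done
        ;;
esac
```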
First I wonder how to detect the instances to work on with some `foo@.service` systemd template unit file. AFAIK I must specify all the instances explicitly, like `foo@A`, `foo@B`, and so on.
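One pattern I have seen suggested (this is my assumption, not something my LSB script does) is a template unit combined with a target that pulls the instances in via `Wants=`; a systemd generator could even create those `Wants=` links from the /etc configuration file at boot. Roughly:

```ini
# foo@.service -- template unit; %i is the instance name (daemon path assumed)
[Unit]
Description=foo instance %i
PartOf=foo.target

[Service]
ExecStart=/usr/sbin/food --instance=%i

[Install]
WantedBy=foo.target
```

```ini
# foo.target -- groups all instances; the list "A B" is illustrative
[Unit]
Description=all foo instances
Wants=foo@A.service foo@B.service

[Install]
WantedBy=multi-user.target
```

Then `systemctl start foo.target` would start all instances, and `PartOf=foo.target` propagates stop/restart from the target to each instance.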
Second, my LSB script can report an extended status, meaning it can display whether a service `reload` is needed (and my `reload` actually optimizes to only reload the services that need it). How can I make such a custom status report? I think a script has to use `systemd-notify` for custom status messages.
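If I read the docs correctly (an assumption on my side: the daemon, or a helper process inside the unit's control group, can send sd_notify messages), a `Type=notify` service can push a free-form status line that `systemctl status` then displays:

```ini
# excerpt of foo@.service (sketch); NotifyAccess=all lets helper
# processes of the unit send notifications, not just the main process
[Service]
Type=notify
NotifyAccess=all
ExecStart=/usr/sbin/food --instance=%i
```

```sh
# e.g. from a reload-check helper running inside the unit:
systemd-notify --status="configuration changed on disk; reload needed"
```

The string then shows up as the `Status:` line in `systemctl status foo@A`.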
Fortunately, my final extension to the LSB script, namely manipulating single instances by adding `single` (as in `rcfoo start single A`), is supported out of the box with systemd.
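With a template unit this is just a matter of addressing the instance by name:

```sh
systemctl start foo@A.service    # start only instance A
systemctl status foo@A.service   # status of instance A alone
systemctl reload foo@A.service   # reload only instance A (needs ExecReload= in the unit)
```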
So my basic question is the first one.
Asked by U. Windl
(1715 rep)
Mar 20, 2019, 02:14 PM
Last activity: May 16, 2019, 11:22 AM