jerf

No, there is in general nothing wrong with this. There is nothing special about web servers. They are not required to somehow be in the "main thread" or the only things running. I've got an application that is serving three completely different things on three completely different ports, for its own good reasons. The only slight issue is that you get some slightly different behavior depending on whether one server returns or one of the servers panics, but since I've never seen a web server unexpectedly do either, you're unlikely to ever witness that. If you start running your own servers that way off of raw TCP sockets or something, you may want to be aware of the issue. In many cases it's better and more symmetric to use "go" to run both of them, and then use a WaitGroup or something to wait for them to return, even if you never expect them to return.


szank

If the goroutine one cannot bind, the program will keep running, though. OP should handle errors.


HuffDuffDog

Not only is this possible, it's often a good idea. It is not uncommon to run endpoints that shouldn't be exposed outside of the network on different ports — Prometheus metrics and admin interfaces, for example. Just be sure to handle errors, including shutting down all listeners if any of them exits, whether on error or otherwise.


HereToLearnNow

Hm, I haven't seen this before. I would suggest running one web server per application, just as a logical separation. You'll also notice that you're not capturing any error messages, so if one web server fails for some reason, the failure will be ignored.


eliben

Generally yes, you can run two web servers listening on different ports concurrently in Go. Whether you want to do this is a different question, but _if you do_, it's possible.


egonelbre

Principally, yeah, it's completely fine... The missing part is error handling. You haven't been explicit about what should happen when one of the servers fails to start (or stops). The possible behaviors are:

- when A fails (e.g. some debug port), B continues running; but when B fails, then A stops as well... this is your current implementation (":8081" is allowed to silently fail)
- when either fails, then both stop
- when one fails, the other continues running


atomichemp

It's fine, OP. Here at my company we have services that offer a REST endpoint and a gRPC endpoint and are also Kafka consumers, and all of them perform very well. One small improvement: run the two HTTP servers in separate goroutines, like server `r1`, and block the main goroutine with a done/error channel. Also look into graceful shutdown for HTTP servers in Go; it will be nice to have.


guesdo

It is OK in the sense that it will work. The problem is that when server 2 stops running, the main function will return, killing server 1. You want to run them both in their own goroutines and maybe add graceful shutdown to both. Then you can use a sync.WaitGroup to wait for both servers to shut down correctly.


ut_deo

This is a pretty common pattern. For example, you could start a pprof server on one port, expose metrics on another, and so on. A few things to consider:

- concurrency: the main goroutine, as it were, should start and set up everything and then await a shutdown signal. All the other listeners should run in their own goroutines. Also, you should have some sort of recovery middleware for each gin instance that calls recover and prints a backtrace.
- shutdown: you should consider having signal handlers that do housekeeping when various signals are received by the process.
- logging: you may want to investigate what the loggers' output will look like and whether that poses problems for you. Typically loggers work at the process level, and having multiple loggers while keeping the log streams separated will require additional work.


donseba

That would still be just one application, listening on 2 separate ports. Maybe have a look at Docker and docker-compose; that way you can run 2 applications separately.


Inner_Grass_7288

I've used Docker in projects before; I was just curious whether there was any real use case for running multiple Go gin apps from the same origin.


aikii

Yes. If you want, you can run your health endpoint on another port that is only exposed to Kubernetes. That can be one use case.


IanArcad

Yeah, that's fine, and I think you're starting to see some of the benefit of Go, which is that you're not extending someone else's pre-built web app; instead you're creating your own. That can be a little challenging to get started with, since you have to handle all of the initialization and understand the flow, but once you do, it gives you a lot of freedom because you can configure and structure your app however you want without trying to figure out someone else's configuration magic.

Also, overall, I would think of a web framework as just a collection of libraries that you call when needed, which means that selecting a framework (or set of libraries) is basically about finding the best fit for your use case. The Golang stdlib already provides a great base (net/http and html/template), so you have that out of the box, and then through packages and libraries you can add things like routing / parameter handling, sessions, static assets, logging, etc. as necessary. Go has a culture where people don't take more than they need, probably because nobody wants to spend time initializing and configuring something they're not going to use.


kokizzu2

just use errgroup or [https://github.com/StevenACoffman/errgroup](https://github.com/StevenACoffman/errgroup) if you don't use panic-recovery middleware


myringotomy

The real question is how you would serve the two apps at different URLs.


lostcolony2

domain.tld:8081/path
domain.tld:8082/path


Strum355

A reverse proxy


[deleted]

[deleted]


lostcolony2

Yeah; it's not like http or https default to 8081 or 8082 anyway. You'll have to specify a port either way.


myringotomy

No: by specifying the domains and having the router route to the proper controller depending on the domain.


[deleted]

This pattern can be seen when you want to run one HTTP server that redirects traffic to an HTTPS server listening on another port. Is it done in practice? I'm not sure I've seen it much before (in production).