Socket_setInterface on Linux always returns SOCKET_ERROR
allanvaughanjones opened this issue · 1 comment
allanvaughanjones commented
I think the logic is wrong in Socket_setInterface. `rc` is initialized to SOCKET_ERROR and is only reassigned in the error path, so on Linux the function returns SOCKET_ERROR even when setsockopt succeeds.
```c
int Socket_setInterface(SOCKET sock, char* interface_name, int family)
{
	int rc = SOCKET_ERROR;
#if other os
	....
#else
	if (setsockopt(sock, SOL_SOCKET, SO_BINDTODEVICE, (void*)interface_name, strlen(interface_name) + 1) == -1)
	{
		rc = Socket_error("SO_BINDTODEVICE", 0);
		Log(LOG_ERROR, -1, "Could not set SO_BINDTODEVICE for socket %d %d\n", sock, rc);
	}
#endif
	return rc;
}
```
Should this be:
```c
	rc = setsockopt(sock, SOL_SOCKET, SO_BINDTODEVICE, (void*)interface_name, strlen(interface_name) + 1);
	if (rc != 0)
	{
		rc = Socket_error("SO_BINDTODEVICE", 0);
		Log(LOG_ERROR, -1, "Could not set SO_BINDTODEVICE for socket %d %d\n", sock, rc);
	}
```
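For context, here is a minimal sketch of how the corrected function could look with that change applied. This is not the upstream implementation: it assumes the paho.mqtt.c internals (`SOCKET`, `SOCKET_ERROR`, `Socket_error()`, `Log()`, `LOG_ERROR`) declared in the library's headers, and the `#if !defined(__linux__)` guard is just a stand-in for the elided "other os" branch in the snippet above.

```c
/* Sketch only, not the upstream code. Assumes paho.mqtt.c's Socket.h/Log.h
 * provide SOCKET, SOCKET_ERROR, Socket_error() and Log(); the non-Linux
 * branch is elided behind a generic guard. */
#include <string.h>      /* strlen */
#include <sys/socket.h>  /* setsockopt, SOL_SOCKET, SO_BINDTODEVICE on Linux */

int Socket_setInterface(SOCKET sock, char* interface_name, int family)
{
	int rc = SOCKET_ERROR;
#if !defined(__linux__)
	/* ... non-Linux handling elided, as in the original snippet ... */
#else
	/* Capture setsockopt's return value so a successful call (0) is
	 * propagated to the caller instead of the initial SOCKET_ERROR. */
	rc = setsockopt(sock, SOL_SOCKET, SO_BINDTODEVICE, (void*)interface_name,
			strlen(interface_name) + 1);
	if (rc != 0)
	{
		rc = Socket_error("SO_BINDTODEVICE", 0);
		Log(LOG_ERROR, -1, "Could not set SO_BINDTODEVICE for socket %d %d\n", sock, rc);
	}
#endif
	return rc;
}
```

This keeps the existing error-reporting path unchanged while letting a successful setsockopt return 0. Note that SO_BINDTODEVICE normally requires CAP_NET_RAW (or root), so a permission failure will still surface through the error path.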
icraggs commented
Yes, the return code should be set. Thanks.