
Reading a huge data flow #42

@maxim133


Hello! I have found strange behavior when reading a long data stream. If the data size is larger than readBufferSize_B_ (default value is 256), the remaining part of the data is never received.
There are two ways to resolve it:

  1. Using a cycle. For example:
    void SerialPort::Read(std::string& data) {
        PortIsOpened(__PRETTY_FUNCTION__);

        // Read from the file descriptor until no more data arrives.
        // We provide the underlying raw array of the readBuffer_ vector to this C API.
        // This is safe because we do not delete/resize the vector while this method
        // is running.
        // Note: the original `while (ssize_t n = read(...))` form exits the loop
        // as soon as n == 0, so its n == 0 branch was unreachable; an explicit
        // loop with a break fixes that.
        while (true) {
            ssize_t n = read(fileDesc_, &readBuffer_[0], readBufferSize_B_);
            if (n < 0) {
                // Read was unsuccessful; report the actual errno
                throw std::system_error(errno, std::system_category());
            }
            else if (n == 0) {
                // n == 0 means EOS, but it is also returned on device disconnection.
                // We query termios2 to distinguish these two states.
                struct termios2 term2;
                int rv = ioctl(fileDesc_, TCGETS2, &term2);

                if (rv != 0) {
                    throw std::system_error(errno, std::system_category());
                }
                break;
            }
            else {
                data += std::string(&readBuffer_[0], n);
            }
        }

        // If code reaches here, the read must have been successful
    }
  2. Return `n`. The user will be responsible for cycling the read.

What do you think about it?
