The maximum size of a single chunk when using chunked transfer encoding. This is only a theoretical maximum used to detect errors in clients; it is highly unlikely that clients will send more than several kilobytes at once.
The maximum size of the body as specified by Content-Length. This is only a theoretical maximum; the actual limit is subject to the limits of the file system used for Dir.tmpdir.
Kcar::Parser.new => parser
Creates a new parser.
parser.reset => parser
Resets the parser so it can be reused by another client.
parser.body_bytes_left => nil or Integer
Returns the number of bytes left to run through #filter_body. This will initially be the value of the "Content-Length" HTTP header after header parsing is complete and will decrease in value as #filter_body is called for each chunk. This should return zero for responses with no body.
This will return nil on "Transfer-Encoding: chunked" responses as well as HTTP/1.0 responses where Content-Length is not set.
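The countdown behavior described above can be modeled in plain Ruby. This is an illustrative sketch of the semantics only, not kcar's implementation (kcar tracks this state in C), and the BodyCountdown class is hypothetical:

```ruby
# Conceptual model of the Content-Length countdown described above.
class BodyCountdown
  attr_reader :body_bytes_left

  def initialize(headers)
    len = headers['Content-Length']
    # nil for chunked responses or HTTP/1.0 responses without Content-Length
    @body_bytes_left = len && Integer(len)
  end

  # decrement as each piece of body data is consumed
  def consume(data)
    @body_bytes_left -= data.bytesize if @body_bytes_left
  end

  def body_eof?
    @body_bytes_left == 0
  end
end

c = BodyCountdown.new('Content-Length' => '11')
c.consume('hello ')
c.consume('world')
c.body_eof? # => true
```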
parser.body_bytes_left = Integer
Sets the number of bytes left to download for HTTP responses with "Content-Length". This raises RuntimeError for chunked responses.
parser.body_eof? => true or false
Detects if we're done filtering the body or not. This can be used to detect when to stop calling #filter_body.
parser.chunked? => true or false
This is used to detect if a response uses chunked Transfer-Encoding or not.
parser.extract_trailers(hdr) => Array
Extracts trailers that were set in the header object as an array of arrays, e.g.:
parser.extract_trailers(hdr) => [ [ 'Content-MD5', '1B2M2Y8AsgTpgAmY7PhCfg==' ] ]
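What this does can be sketched in plain Ruby, assuming hdr is a Hash and the "Trailer" header names the trailer fields. The method below is an illustrative stand-in, not kcar's C implementation:

```ruby
# Illustrative sketch of trailer extraction from a header Hash.
# Assumes the "Trailer" header lists comma-separated trailer field names.
def extract_trailers(hdr)
  trailer = hdr['Trailer']
  return [] unless trailer
  trailer.split(/\s*,\s*/).map { |name| [name, hdr[name]] }.select { |_, value| value }
end

hdr = {
  'Trailer' => 'Content-MD5',
  'Content-MD5' => '1B2M2Y8AsgTpgAmY7PhCfg==',
}
extract_trailers(hdr) # => [["Content-MD5", "1B2M2Y8AsgTpgAmY7PhCfg=="]]
```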
parser.filter_body(buf, data) => nil or data
Takes a String of data and will modify data if dechunking is done. Returns nil if there is more data left to process. Returns data if body processing is complete. When returning data, it may modify data so the start of the string points to where the body ended so that trailer processing can begin.
Raises ParserError if there are dechunking errors. Basically this is a glorified memcpy(3) that copies data into buf while filtering it through the dechunker.
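For illustration, here is a minimal pure-Ruby dechunker showing the kind of transformation the dechunker performs. kcar itself works incrementally and raises ParserError on malformed input; this sketch assumes a complete, well-formed chunked body:

```ruby
# Minimal dechunker for a complete chunked-transfer-encoded body.
# A conceptual sketch only, not kcar's incremental C implementation.
def dechunk(data)
  body = ''.dup
  until data.empty?
    size_line, data = data.split("\r\n", 2)
    size = size_line.to_i(16)        # chunk sizes are hexadecimal
    break if size.zero?              # a zero-size chunk terminates the body
    body << data.byteslice(0, size)
    data = data.byteslice(size + 2, data.bytesize) # skip chunk data + CRLF
  end
  body
end

dechunk("5\r\nhello\r\n6\r\n world\r\n0\r\n\r\n") # => "hello world"
```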
parser.headers(hdr, data) => hdr or nil
Takes a Hash and a String of data, parses the String of data filling in the Hash, returning the Hash if parsing is finished and nil otherwise. When returning the hdr Hash, it may modify data to point to where body processing should begin.
Raises ParserError if there are parsing errors.
parser.keepalive? => true or false
This should be used to detect if a request can really handle keepalives and pipelining. Currently, the rules are:
MUST be HTTP/1.1, or HTTP/1.0 with "Connection: keep-alive"
MUST NOT have "Connection: close" set
If there is a response body, either a) Content-Length is set or b) chunked encoding is used
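The rules above can be sketched as a plain-Ruby predicate. The argument names (http_version, headers, has_body) are illustrative assumptions for the sketch, not kcar's internal representation:

```ruby
# Sketch of the keepalive rules listed above.
def keepalive?(http_version:, headers:, has_body:)
  connection = headers['Connection'].to_s.downcase
  # MUST NOT have "Connection: close" set
  return false if connection == 'close'
  # MUST be HTTP/1.1, or HTTP/1.0 with "Connection: keep-alive"
  case http_version
  when 'HTTP/1.1'
    # persistent connections are the default in HTTP/1.1
  when 'HTTP/1.0'
    return false unless connection == 'keep-alive'
  else
    return false
  end
  # a body must be delimited by Content-Length or chunked encoding
  if has_body
    chunked = headers['Transfer-Encoding'].to_s.downcase == 'chunked'
    return false unless headers['Content-Length'] || chunked
  end
  true
end
```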
mail archives: https://bogomips.org/kcar-public/ public: kcar-public@bogomips.org