676: Document how to raise the mysql packet size

lukat*****@gmai***** (Google Code)
Oct. 12, 2012
What's the URL of the page containing the problem?
post-review command line tool

What steps will reproduce the problem?
1. Use the post-review tool to upload a diff for review

What is the expected output? What do you see instead?
The expected output is that a new review request should be posted. Instead, I get an error stating that the diff could not be uploaded. Relevant output from -d:

>>> HTTP POSTing to 
http://dev.meebo.com/codereview/api/json/reviewrequests/19/diff/new/: {'basedir': 
'/web/branches/communityim/im-gateway/ig'}
Error uploading diff: One or more fields had errors (105)
>>> {u'fields': {u'path': [u'The diff file is empty']}, u'stat': u'fail', u'err': {u'msg': u'One or more 
fields had errors', u'code': 105}}
Your review request still exists, but the diff is not attached.

What operating system are you using? What browser?
On Linux, using svn version 1.5.x

Please provide any additional information below.
I compared the diff generated by post-review with the one created by svn diff (which works if I upload it manually). The output differed in the position of the newlines. Removing the 'split-lines' call from the do_diff function fixed the problem for me. (I also removed --diff-cmd=diff.)
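
A minimal illustration of how splitting a diff into lines and re-joining it can move or drop newlines. This is only a sketch of the suspected effect, not post-review's actual do_diff code:

# Sketch only: str.splitlines() drops the line terminators (and also splits
# on bare \r), so splitting a diff and re-joining it with "\n" normalizes
# \r\n endings and loses the trailing newline.
raw_diff = "Index: foo.txt\r\n--- foo.txt\r\n+++ foo.txt\r\n"
rejoined = "\n".join(raw_diff.splitlines())

print(repr(raw_diff))   # 'Index: foo.txt\r\n--- foo.txt\r\n+++ foo.txt\r\n'
print(repr(rejoined))   # 'Index: foo.txt\n--- foo.txt\n+++ foo.txt'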
#1 chipx86
Sorry, I know this is a bit old, but do you still have a good repro case for this, and can you give an example of the newline issue?
  • +NeedInfo
#3 david
  • -NeedInfo
    +Incomplete
#4 kelvin.*******@gmai***** (Google Code)
This issue appears to be triggered by a very large diff upload combined with a MySQL setting that prevents the SQL statement from being accepted.

I'm using reviewboard 1.0rc2, Perforce (2007.2), MySQL 5.0.

>>> Uploading diff, size: 1593200
>>> HTTP POSTing to http://reviews.<snip>/api/json/reviewrequests/13/diff/new/: {}
Error uploading diff: One or more fields had errors (105)
>>> {u'fields': {u'path': [u'(1153, "Got a packet bigger than \'max_allowed_packet\' bytes")']}, u'stat': u'fail', 
u'err': {u'msg': u'One or more fields had errors', u'code': 105}}
Your review request still exists, but the diff is not attached.

Note the size of the diff: 1,593,200 bytes.
Next, the max_allowed_packet setting in mysql:
mysql> show variables like "max_allowed_packet";
+--------------------+---------+
| Variable_name      | Value   |
+--------------------+---------+
| max_allowed_packet | 1048576 | 
+--------------------+---------+
1 row in set (0.00 sec)

The page at 
http://dev.mysql.com/doc/refman/5.0/en/packet-too-large.html
says:

"Both the client and the server have their own max_allowed_packet variable, so if you want to handle big 
packets, you must increase this variable both in the client and in the server."

I'll try changing just the server side, but I don't know if this will be sufficient based on what the mysql link 
above says.
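
For reference, a sketch of the usual fix (the 16M value below is just an example, not something verified in this report): raise max_allowed_packet on the MySQL server, either in my.cnf or at runtime.

# Server side, in my.cnf (restart mysqld afterwards):
[mysqld]
max_allowed_packet = 16M

# Or, if your server supports changing it at runtime (applies to new connections):
mysql> SET GLOBAL max_allowed_packet = 16777216;
mysql> SHOW VARIABLES LIKE "max_allowed_packet";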

#5 chipx86
If we can't fix this, we should at least doc it.
  • -Incomplete
    +Confirmed
  • +Milestone-Release1.1
    +Component-Docs
#6 david
  • +Document how to raise the mysql packet size
#7 david
  • -Type-Defect
    +Type-Enhancement
#8 eva***@gmai***** (Google Code)
Diffs that huge are unreviewable anyway, and probably binary, so there's not really a
good reason to store them in the DB.  Here's a quick patch to skip over them.
  • +
    --- a/src/ReviewBoard/reviewboard/reviewboard/diffviewer/parser.py
    +++ b/src/ReviewBoard/reviewboard/reviewboard/diffviewer/parser.py
    @@ -42,6 +42,7 @@ class DiffParser(object):
             self.files = []
             file = None
             i = 0
    +        fileData = "" # accumulates all diff lines for a file
     
             # Go through each line in the diff, looking for diff headers.
             while i < len(self.lines):
    @@ -49,19 +50,31 @@ class DiffParser(object):
     
                 if new_file:
                     # This line is the start of a new file diff.
    +                if file:
    +                    file.data += self._check_diff_size(file, fileData)
    +                fileData = ""
                     file = new_file
                     self.files.append(file)
                     i = next_linenum
                 else:
                     if file:
    -                    file.data += self.lines[i] + "\n"
    -
    +                    fileData += self.lines[i] + "\n"
                     i += 1
     
    +        if file:
    +            file.data += self._check_diff_size(file, fileData)
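
The attachment appears to be cut off above, and the _check_diff_size() helper it calls is not shown. Purely as a sketch of the idea (the cap and the placeholder text are invented, not taken from the actual patch), such a helper on DiffParser might look like:

DIFF_SIZE_LIMIT = 1024 * 1024  # 1 MB; arbitrary example cap

def _check_diff_size(self, file, fileData):
    """Return fileData, or a short placeholder if it is too large to store."""
    if len(fileData) > DIFF_SIZE_LIMIT:
        # Skip huge (probably binary) per-file diffs rather than writing
        # them to the database.
        return "(diff omitted: %d bytes)\n" % len(fileData)

    return fileData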
#9 chipx86
  • +Milestone-Release1.5
#10 chipx86
  • -Milestone-Release1.5
    +Milestone-Release1.5.x
#11 chipx86
Pushing these out, since they won't happen for 1.5.x. The focus is on getting 1.6 out right now.
  • +Milestone-Release1.6
#12 chipx86
Pushing lower-priority docs to 1.6.x.
  • +Milestone-Release1.6.x
#13 david
  • +EasyFix
#14 ench*****@gmai***** (Google Code)
Will look into this.
#15 david
Pushed to master (3882cee). Thanks!
  • -Confirmed
    +Fixed