
So my problem is this: I usually connect my laptop to an external screen, and everything works fine, until I need to bring my laptop to the university for a progress report. When my laptop is not connected to an external screen, the window shown by imshow is cut off at the bottom, as shown in this picture.

Has anyone encountered this problem before? If so, can you share how to fix it? My laptop uses Windows 10, Python 3.6.4 and OpenCV 3.3.0.

My current code is a bit long, so I will just give the flow here: read an image with imread, crop the ROI from it, convert the ROI to grayscale with cvtColor, create a mask to filter out colors with cv2.inRange, apply that mask to the grayscale image with bitwise_and, then do a perspective change with getPerspectiveTransform and warpPerspective, save the images with imwrite, and finally show the results with imshow and waitKey.

Below are sample lines of code for each step.

Bottom part is cut off

'image' and 'gray' are 480x360 images, 'cropped' and 'white_lanes' are 480x150 images, and 'mapped' is a 480x550 image.

My code:

image = cv2.imread(args["data"])

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

cropped = gray[210:360]

mask_white = cv2.inRange(cropped, 220, 255)

white_lanes = cv2.bitwise_and(cropped, mask_white)

M = cv2.getPerspectiveTransform(pts1, pts2)

mapped = cv2.warpPerspective(white_lanes, M, (wlwidth+2*expansionx, wlheight+expansiony))

filename=args["data"][11:-4]

cv2.imwrite("cropmap{}.jpg".format(filename), cropmap)

cv2.imshow("Cropped Gray Image", cropped)

cv2.waitKey(0)
1
As your code is not provided... a simple hint towards your answer: use a form of image autoresizer. Your image size seems to be coded statically, not dynamically. Keep proportions in mind. And use "!" in front of your image link... it will then show it to us ;p – ZF007
Hi @ZF007, I edited the post as you suggested. Yes, my image sizes use static code, and the proportions of the images shown by imshow are not the same. – Bao Tran

1 Answer


In your code you show cv2.imshow("Cropped Gray Image", cropped) but not cv2.imshow("Mapped Image", mapped), which is clearly the window shown in your posted image.

With mapped = cv2.warpPerspective(white_lanes, M, (wlwidth+2*expansionx, wlheight+expansiony)) you perform a warp, but your code doesn't show the resulting dimensions, so it is hard to tell whether it fits within your laptop's screen size.

Thus add...

resized_mapped_image = cv2.resize(mapped, (0, 0), fx=0.5, fy=0.5)

...then show it with cv2.imshow("Mapped Image", resized_mapped_image),

...which should downsize the image by half and solve the issue.

Otherwise...:

  1. What does your image array size look like just before M = cv2.getPerspectiveTransform(pts1, pts2)?
  2. Check that for each subsequent step...
  3. Also, after cropping you show an image... but the window in your screenshot shows values on the x-axis, which doesn't make sense for cv2.imshow("Cropped Gray Image", cropped); that window looks like the mapped image rather than only the cropped version...