Yes, you should also be able to convert an Inception model to TFLite. You only need the checkpoints if the graph is not yet frozen. If the graph is already frozen (which I assume), you should be able to convert it with the following command:
bazel run --config=opt //tensorflow/contrib/lite/toco:toco -- \
--input_file=**/path/to/your/graph.pb** \
--output_file=**/path/to/your/output.tflite** \
--input_format=TENSORFLOW_GRAPHDEF \
--output_format=TFLITE \
--inference_type=FLOAT \
--input_shape=1,299,299,3 \
--input_array=**your_input** \
--output_array=**your_final_tensor**
(you have to replace the text between the asterisks with the arguments that apply to your case; --input_array=Mul for example)
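For example, for an Inception v3 graph retrained with TensorFlow's retrain.py script, the input tensor is typically named Mul and the output tensor final_result (check your own graph to be sure; the paths below are just placeholders), so the filled-in command could look like this:

bazel run --config=opt //tensorflow/contrib/lite/toco:toco -- \
--input_file=/tmp/retrained_graph.pb \
--output_file=/tmp/retrained_graph.tflite \
--input_format=TENSORFLOW_GRAPHDEF \
--output_format=TFLITE \
--inference_type=FLOAT \
--input_shape=1,299,299,3 \
--input_array=Mul \
--output_array=final_result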
Note on --input_array=Mul
Some of the TF operations used in Inception v3 are not supported by TFLite (DecodeJpeg, ExpandDims), since they typically do not have to be executed by the model on the mobile phone (these tasks are done directly in the app code). Therefore you have to define where you want to hook into the graph with TF Lite.
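If you are not sure which names to pass to --input_array and --output_array, you can list the operations in the frozen graph first. A minimal Python sketch (TF 1.x; the path is a placeholder):

import tensorflow as tf

# Load the frozen GraphDef from disk.
graph_def = tf.GraphDef()
with tf.gfile.GFile('/path/to/your/graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# Print every node name and op type so you can spot a suitable
# entry point (e.g. Mul, right after the decode/resize ops) and
# the final output tensor of the graph.
for node in graph_def.node:
    print(node.name, node.op)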
You will probably get the following error message if you do not set --input_array to a tensor behind these operations:
Some of the operators in the model are not supported by the standard TensorFlow Lite runtime. If you have a custom implementation for them you can disable this error with --allow_custom_ops. Here is a list of operators for which you will need custom implementations: DecodeJpeg, ExpandDims.
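Since the operations in front of your chosen input tensor are cut out of the converted graph, your app has to replicate them before feeding the interpreter. A rough Python sketch of that preprocessing, assuming the usual Inception v3 scaling with mean 128 and std 128 (the exact values depend on how your graph was trained, and the image path is a placeholder):

import numpy as np
from PIL import Image

# DecodeJpeg + resize: load the image and scale it to 299x299.
img = Image.open('/path/to/image.jpg').resize((299, 299))
data = np.asarray(img, dtype=np.float32)

# Normalization that the graph applied before the Mul tensor
# (assumed mean=128, std=128 as in the stock Inception v3 setup).
data = (data - 128.0) / 128.0

# ExpandDims: add the batch dimension -> shape (1, 299, 299, 3).
data = np.expand_dims(data, axis=0)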
I hope I could help you. I am currently struggling with converting retrained graphs myself.