I am writing an optimization program in Julia using JuMP. My Julia version is 1.3.1 and my JuMP version is 0.21.2.
One of my variables is a matrix, which is a convenient structure in my case.
using JuMP
using LinearAlgebra
c1H = Vector(1:3)
model = Model()
@variable(model, GiH[1:10, 1:3] >= 0)
test = rand(10,3)
In the objective function I multiply this matrix by a (parameter) vector and then sum the entries of the resulting vector. I want to write it like this:
@objective(model, Min, sum(GiH*c1H))
which is equivalent to
@objective(model, Min, ones(10)'*(GiH*c1H))
This runs fine when I replace the variable matrix with the numeric matrix test.
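That is, the same objective with test substituted for GiH is accepted without error:
@objective(model, Min, sum(test*c1H))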
However, with the variable matrix GiH I get the following error:
MethodError: no method matching similar(::Array{Float64,1}, ::Type{GenericAffExpr{Float64,VariableRef}}, ::Array{Int64,1})
Closest candidates are:
  similar(::Array{T,1}, ::Type) where T at array.jl:331
  similar(::Array, ::Type, !Matched::Tuple{Vararg{Int64,N}}) where N at array.jl:334
  similar(::AbstractArray, ::Type{T}) where T at abstractarray.jl:626
  ...
Stacktrace:
 [1] *(::JuMP.Containers.DenseAxisArray{VariableRef,2,Tuple{Array{Int64,1},Array{Int64,1}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}}, ::Array{Float64,1}) at matmul.jl:51
 [2] top-level scope at rewrite.jl:227
 [3] top-level scope at macros.jl:762
What is going on? It seems that matrix multiplication is not defined for JuMP variable matrices?
I know I can replace this matrix multiplication with nested sum(... for ...) expressions, but I want to know if it is possible to do it differently.
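For reference, the nested-sum workaround I mean is roughly the following (using the 10x3 dimensions from the example above):
@objective(model, Min, sum(sum(GiH[i, j] * c1H[j] for j in 1:3) for i in 1:10))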