If I have independent variables [x1, x2, x3] and fit a linear regression in sklearn, it will give me something like this:
y = a*x1 + b*x2 + c*x3 + intercept
For generating polynomial features, I assume you are using sklearn.preprocessing.PolynomialFeatures.
The constructor takes an argument that restricts the expansion to interaction terms only. So you can write something like:
from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(interaction_only=True, include_bias=False)
X_poly = poly.fit_transform(X)
Now only the interaction terms are kept and the pure higher-degree terms (x1**2, x2**2, ...) are omitted. Your new feature space becomes [x1, x2, x3, x1*x2, x1*x3, x2*x3].
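If you want to double-check the column order, the transformer can report the generated feature names (get_feature_names_out in recent sklearn versions; older versions call it get_feature_names):

poly.get_feature_names_out(['x1', 'x2', 'x3'])
# ['x1', 'x2', 'x3', 'x1 x2', 'x1 x3', 'x2 x3']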
You can then fit your regression model on top of the transformed features:

from sklearn import linear_model

clf = linear_model.LinearRegression()
clf.fit(X_poly, y)
Making your resultant equation y = a*x1 + b*x2 + c*x3 + d*x1*x2 + e*x1*x3 + f*x2*x3 + intercept.
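The learned coefficients line up with the columns of the transformed matrix, so you can read a through f and the intercept straight off the fitted model:

clf.coef_       # [a, b, c, d, e, f], one per column of X_poly
clf.intercept_  # the intercept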
Note: If you have a high-dimensional feature space, this expansion can lead to the curse of dimensionality, which in turn can cause problems like overfitting/high variance.
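To get a feel for how fast that grows: with interaction_only=True, expanding n features yields n + n*(n-1)/2 columns. A quick sketch, just counting:

import math

for n in (3, 10, 50, 100):
    n_cols = n + math.comb(n, 2)  # original features plus pairwise interactions
    print(n, '->', n_cols)
# 3 -> 6, 10 -> 55, 50 -> 1275, 100 -> 5050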