A general achievable rate region and two certain capacity regions for Slepian–Wolf multiple-access relay channel with non-causal side information at one encoder

The two-user multiple-access relay channel (MARC) with side information (SI) non-causally known at one encoder is studied, where the transmitters send a common message at rate R_0 and private messages at rates R_1 and R_2. A general achievable rate region is derived by appropriately combining superposition block Markov encoding, a binning scheme, Gel'fand–Pinsker (GP) coding and the partial decode-and-forward (PDF) strategy at the relay. It is also shown that this capacity inner bound is tight for two special classes of degraded and reversely degraded multiple-access relay channels with one informed encoder. To make the theoretical findings applicable to the evaluation of communication performance metrics, the obtained general achievable rate region, which contains all previous results on point-to-point, relay and multiple-access channels with and without SI as special cases, is extended to the corresponding Gaussian version via dirty paper coding, a challenging and widely studied problem, and the resulting region includes a variety of previously studied important Gaussian results. Finally, the mathematical results are evaluated numerically.


INTRODUCTION
Shannon [1] studied the point-to-point channel in the presence of causal side information. Non-causal SI at the transmitter was introduced in [2], and the corresponding channel capacity was obtained by Gel'fand and Pinsker (GP) [3]; Costa extended the GP result to the Gaussian version by dirty paper coding (DPC) [4]. In [5], the authors analysed the effect of the correlation between the channel input and the SI as a generalisation of DPC, and in [7], the authors investigated multi-layer coding over a dirty-paper channel. The effect of side information (causal or non-causal) known at the source(s) and destination on the capacity region of the multiple-access channel (MAC) has been studied extensively in the literature. A class of time-varying multiple-access channels (TVMACs) with state known to the encoders and the decoder was studied in [6]. Jafar [8] studied the capacity regions of the point-to-point channel and the MAC with independent SI causally or non-causally known at the source(s).
The MAC with SI was studied using lattice strategies, and the fading MAC, in [24], among other works. The relay channel (RC), first introduced by van der Meulen [25], was studied by Cover and El Gamal for a variety of classes, such as degraded, reversely degraded, full-feedback and Gaussian degraded relay channels; they defined the decode-and-forward (DF) and estimate-or-compress-and-forward (EF or CF) strategies at the relay, and obtained a capacity upper bound and a general capacity lower bound for the RC [26]. The RC has since been studied widely in the literature [27–33], and also with SI in [34–36].
The multiple-access relay channel (MARC), in which a relay cooperates with several transmitters to communicate with the destination, was introduced in [37]. Capacity inner bounds for the MARC with the DF, CF and amplify-and-forward (AF) coding strategies were studied extensively in [38], [39] and [40]. A new achievable rate region for the Slepian–Wolf MARC was derived in [41]; capacity inner bounds for the discrete memoryless MARC with the partial decode-and-forward (PDF) strategy and regular block Markov coding/backward decoding, with and without non-causal SI known at one transmitter, were obtained in [42] and [43], respectively. The capacity inner bound for the DM-MARC with non-causal SI known at the relay, and its extension to the Gaussian case, was studied in [44]. Furthermore, the MARC with SI at both cooperating encoders, the MARC with relay-source feedback, and the transmission of analog information over the MARC were studied in [45], [46] and [47], respectively, and the MIMO state-dependent channel in which only a helper node has non-causal knowledge of the state was studied in [48].

Our motivation and work
As shown in previous works, non-causal SI can improve the rate regions of the point-to-point channel, the MAC and the MARC. The MARC with the PDF strategy at the relay has been studied without SI in [41] and [42] and with SI at the relay in [44]. However, the MARC with a common message at the transmitters, SI at one encoder and the PDF strategy at the relay, as a general communication channel that includes the Slepian–Wolf MAC, the widely studied MAC with SI at one encoder, the RC with the PDF strategy at the relay, and the point-to-point channel with SI, has not been studied before.
In this paper, this general channel is analysed from the viewpoint of the achievable rate region. Specifically, we consider a two-user multiple-access relay channel with non-causal side information at one encoder only, where the uninformed and informed encoders transmit a common message at rate R_0 and private messages at rates R_1 and R_2, respectively. To obtain a capacity inner bound for the MARC, three different strategies can be used: irregular encoding with successive decoding, regular encoding with backward decoding, and regular encoding with sliding-window decoding, none of which has been proved to be optimal. In [43], regular block Markov encoding and backward decoding were used to study the effect of non-causal side information available at one encoder on the achievable rate region of the discrete memoryless MARC; in the present paper, irregular block Markov encoding and successive decoding are used to obtain a capacity inner bound by appropriately applying the binning scheme, superposition coding, block Markov encoding and the Gel'fand–Pinsker method. As is easily seen, with side information non-causally known at one encoder, neither of these achievable rate regions includes the other. Also in this paper, for the first time, we obtain the capacity regions of two special classes of degraded and reversely degraded two-user multiple-access relay channels with side information known non-causally at one encoder. In addition, our discrete alphabet results are extended to the continuous alphabet Gaussian MARC, using a two-layer extension of dirty paper coding for the PDF strategy, which is useful for analysing the roles of the side information and the relay in communication performance measures such as transmission rate, energy efficiency and coverage region.
Additionally, to show the effect of side information on the capacity inner bound, comparisons with previous works and numerical simulations are provided.

Paper organisation
The rest of the paper is organised as follows. In Section 2, the two-user discrete memoryless multiple-access relay channel with non-causal side information at one encoder only is described and a capacity inner bound is proved. In Section 3, capacity regions for two special classes of MARCs with one informed encoder are obtained. In Section 4, the results are extended to the Gaussian version. In Section 5, the numerical results are presented. Finally, we conclude the paper in Section 6.

Notations
Throughout this paper, uppercase and lowercase letters represent random variables and their realisations, respectively. The probability mass functions (pmfs) of random variables X and Y, with x ∈ 𝒳 and y ∈ 𝒴, are denoted by p_X(x) and p_Y(y), respectively, and the conditional pmf of X given Y is denoted by p_{X|Y}(x|y). An n-sequence drawn according to p_X(x) is denoted X^n ≜ (X_1, X_2, …, X_n), and the set of all ε-typical n-sequences X^n is denoted A_ε^n.

A GENERAL CAPACITY INNER BOUND FOR DM-MARC WITH SI NON-CAUSALLY KNOWN AT ONE SOURCE
The two-user DM-MARC consists of one relay that aids two sources in communicating with one destination, as shown in Figure 1. The relay improves communication performance between the transmitters and the receiver, such as the coverage region and the transmission rate. Different coding strategies can be employed at the relay, such as DF, PDF, CF or AF. In the PDF strategy, the relay decodes only part of the messages, whereas in the DF strategy, the relay decodes the whole messages sent by the sources. PDF and DF can be optimal when the source-relay channels are excellent.
• Two source encoders x_1^n(m_1) and x_2^n(m_2, s^n), where m_1 ∈ M_1 and m_2 ∈ M_2.
The average probability of error is defined as P_e^{(n)} = Pr{(M̂_0, M̂_1, M̂_2) ≠ (M_0, M_1, M_2)}. A rate triple (R_0, R_1, R_2) is achievable if P_e^{(n)} → 0 for some sequence of (2^{nR_0}, 2^{nR_1}, 2^{nR_2}, n) codes.
The most general strategy for the relay is partial decode-and-forward (PDF), in which the relay decodes all or part of the message(s) and cooperates with the encoders in communicating the message(s) to the destination. Hence, we use block Markov encoding and random binning, in which the capacity inner bound is obtained over b transmission blocks, each consisting of n transmissions, where the relay cooperates in transmitting the messages by sending the bin index. Note that the average rate over the b blocks is R(b − 1)/b, which can be made as close to R as desired. The encoders use superposition coding, enabling them to transmit both the common and the individual messages; the informed encoder, knowing the side information non-causally, uses Gel'fand–Pinsker coding, which cancels the effect of the interference in the Gaussian version by dirty paper coding; and the relay and the receiver perform the decoding in the usual way.
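As a quick numerical check of this rate loss (a sketch with a hypothetical helper function, not code from the paper), the b-block average rate R(b − 1)/b approaches R as b grows:

```python
def effective_rate(R, b):
    """Average rate of b-block Markov encoding: one of the b blocks carries
    no fresh message, so a target rate R is realised as R * (b - 1) / b."""
    return R * (b - 1) / b

# The per-block overhead vanishes as the number of blocks grows:
for b in (2, 10, 100):
    print(b, effective_rate(1.0, b))  # 0.5, then 0.9, then 0.99
```

With b = 100 blocks, the loss is already only one percent of the target rate.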
We use the binning scheme, block Markov coding and superposition coding to prove the theorem. In the binning scheme, the relay helps the encoders by sending the bin index L_j of the messages (M_{1,j}, M_{2,j}) in block j + 1.
We split the private message M_i, i = 1, 2, with rate R_i into two parts M'_i and M''_i with rates R'_i and R''_i, respectively. We now prove a theorem giving a general inner bound on the capacity region of the MARC with the PDF strategy at the relay and side information (SI) at one encoder. The theorem, while having all known previous results as special cases (as explained in the corollaries below), clearly shows the effects of the SI and of the relay with the PDF strategy in comparison with the corresponding special channels, and it can be extended to the continuous alphabet Gaussian version in order to clarify the practical aspects of the results.
where the union is taken over: and satisfying: Proof. See Appendix A.1. □ Remark 1. Due to the distribution of p(s, x_R, u_0, u_1, u_2, x_1, v) in (8), it can easily be seen that: Hence, the following inequalities (noting that I(X; Y|Z) ≥ I(X; Y) for Z independent of X) are derived for both the relay and the decoder (L = R, D), and are used in the details of the proof.
Remark 2. In Theorem 1, we derived the capacity inner bound for the MARC with full non-causal SI known at the informed encoder; however, the results extend directly to imperfect SI known non-causally at the informed encoder by replacing S with S̃ in Equations (1)–(11).

Corollaries of Theorem 1:
Now, we compare our capacity inner bound with previous works and show that our result includes all previous regions.

Corollary 1. The achievable rate region for the Slepian–Wolf multiple-access channel [49] can be obtained by setting:

Corollary 2. The region of [50] can be derived by setting:

Corollary 3. The region of [15] can be achieved by setting: X_R = ∅,

Corollary 4. The achievable rate region for the MAC with non-causal side information at the informed source, in which both encoders transmit a generic message and the individual message is transmitted by the informed source [51], can be obtained by setting:

Corollary 5. The inner bound of the two-user MAC with SI at only one transmitter, where the uninformed source's message is known at the informed source [16], can be obtained by setting: U_0 = X_2, U_2 = U,

Corollary 6. It can be shown that the achievable region for the two-user MAC with SI at only one source, where only the uninformed node sends an individual message [17], is obtained by setting:

Corollary 7. Two lower bounds for the DM-RC with non-causal side information known at the source were derived in [35], in which the source transmits two layers of the state description to the relay and the destination. As can easily be seen, the inner bound of [35, Theorem 1] is included in our region as a special case.

Corollary 8. The inner bound of the multiple-access relay channel [37] can be obtained by setting: S = ∅,
Corollary 9. It is readily seen that our inner bound reduces to the general achievable rate region of [41] by setting S = ∅ and V = X_2.

CAPACITY REGIONS FOR TWO SPECIAL CLASSES OF DEGRADED AND REVERSELY DEGRADED MARCs WITH ONE INFORMED TRANSMITTER
The general capacity region of the multiple-access relay channel, or even of the relay channel, with or without side information is an open problem. However, in this section, we present capacity regions for two special classes of the multiple-access relay channel in the presence of side information known non-causally at only one encoder.
We consider a two-user multiple-access relay channel with SI at the informed transmitter such that the channels from the encoders to the relay are better than the channels from the transmitters to the destination; this is known as the degraded MARC. Since each encoder-to-relay channel is better than the corresponding encoder-to-destination channel, the relay can recover all the messages; hence, we use the DF strategy.

Theorem 2.
Consider the degraded MARC in which the uninformed encoder transmits only the common message and the informed encoder transmits the common and private messages. The capacity region of this channel is the convex hull of all rates (R_1, R_2) satisfying: with constraints: where U_{2i} = (W_0, W_2, S_{i+1}^n, Y_D^{i-1}), and the union is taken over: Proof. See Appendix A.2. □ We also consider a two-user reversely degraded multiple-access relay channel with only one informed encoder, where the encoders transmit only individual messages and the relay uses the PDF strategy.
Theorem 3. Consider the reversely degraded multiple-access relay channel with non-causal SI at one encoder, where the relay decodes only part of the messages and the remaining parts are decoded only at the destination. The encoders transmit only private messages with rates (R_1, R_2). The capacity region of this channel is the convex hull of all rates (R_1, R_2) satisfying: where the union is taken over: and we define:

THE GAUSSIAN MARC WITH NON-CAUSAL SIDE INFORMATION AT ONE ENCODER (EXTENSION OF THEOREM 1 TO THE GAUSSIAN VERSION)
The Gaussian rate region is useful for computing practical communication performance. Because of these applications, in this section we consider a Gaussian MARC with SI at one transmitter and derive the corresponding capacity inner bound. The outputs of the channel are: where X_1^n is the signal transmitted by the uninformed encoder with average power constraint ∑_{i=1}^n X_{1,i}^2 ≤ nP_1, X_2^n is the informed encoder's signal with average power constraint ∑_{i=1}^n X_{2,i}^2 ≤ nP_2, and X_R^n is the relay's signal with average power constraint ∑_{i=1}^n X_{R,i}^2 ≤ nP_R. S^n is the side information known at the informed encoder; S is assumed to be a zero-mean Gaussian random variable with variance Q_s, but, unlike in Costa's work, it is allowed to be dependent on X_2. Z_R and Z_D are independent zero-mean normal random variables with variances N_R and N_D, respectively. g_{R1}, g_{R2}, g_{D1}, g_{D2}, g_{RD}, g_{SR} and g_{SD} are positive constants representing the static gains of the links.
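The channel equations themselves are lost in the extracted text above; the sketch below assumes the standard additive form consistent with the listed link gains (all powers, variances and gain values are illustrative assumptions, not values from the paper) and checks the power constraint empirically:

```python
import math
import random

random.seed(0)
n = 20000
# Hypothetical powers, state variance, noise variances and link gains:
P1, P2, PR, Qs = 1.0, 1.0, 1.0, 0.5
NR, ND = 1.0, 1.0
gR1, gR2, gD1, gD2, gRD, gSR, gSD = 1.0, 1.0, 0.8, 0.8, 0.9, 0.5, 0.5

def gauss(var):
    """Generate n i.i.d. zero-mean Gaussian samples with the given variance."""
    return [random.gauss(0.0, math.sqrt(var)) for _ in range(n)]

X1, X2, XR, S = gauss(P1), gauss(P2), gauss(PR), gauss(Qs)
ZR, ZD = gauss(NR), gauss(ND)

# Assumed channel form, consistent with the link gains listed in the text:
YR = [gR1*x1 + gR2*x2 + gSR*s + zr for x1, x2, s, zr in zip(X1, X2, S, ZR)]
YD = [gD1*x1 + gD2*x2 + gRD*xr + gSD*s + zd
      for x1, x2, xr, s, zd in zip(X1, X2, XR, S, ZD)]

# Empirical average power of X1 respects sum(X1_i^2) <= n * P1:
print(sum(x * x for x in X1) / n)  # close to P1 = 1.0
```

For independent inputs, the empirical power of Y_D concentrates near g_D1²P_1 + g_D2²P_2 + g_RD²P_R + g_SD²Q_s + N_D, which is the quantity that enters the Gaussian mutual information terms of Theorem 4.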
The distributions of the inputs, the side information and the auxiliary random variables in (8) can be extended to the linear continuous version, as explained in the proof of Theorem 4.

Theorem 4. The achievable rate region for Gaussian MARC with one informed encoder can be found as:
where: Remark 3. (i) In Theorem 1, for the discrete alphabet memoryless MARC with SI at one encoder, the parameters are the channel outputs (Y_D, Y_R), the channel inputs (X_1, X_2, X_R), the side information S, and the auxiliary random variables (U_1, U_2, U_0, V) characterising the relay strategy and the different parts of the messages. The rate region is described by various mutual information terms between these variables, in which the role of each variable, such as S, is visible and easily interpreted. (ii) In Theorem 4 for the Gaussian MARC (the continuous alphabet version), in addition to Y_D, Y_R, X_1, X_2, X_R, U_1, U_2, U_0 and V, the noise variables at the relay and the receiver appear; and since Gaussian entropies, and hence the mutual information terms, depend on the variances of the variables involved, relations (23)–(29) contain the channel gains, the average powers and the corresponding correlation coefficients, where the role of each of these parameters is visible in every term and can be interpreted.
Remark 4. (i) Consider the Gaussian multiple-access relay channel with partial side information S̃ at one informed encoder satisfying (21) and (22). We assume S = S̃ + E_S, with E_S an independent zero-mean Gaussian error. It can easily be shown that the capacity inner bound for the MARC with imperfect SI at the informed encoder is obtained by transforming (23)–(29) in Theorem 4 accordingly, thereby showing the effect of the imperfectness of the side information. (ii) In relations (23)–(29) for the Gaussian MARC rate region, the impact of the correlations between the random variables (e.g. ρ_{2s} between the SI and the input X_2) is clearly visible.
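A minimal simulation of this imperfect-SI model (the variances are illustrative assumptions, not values from the paper): only S̃ is available for precoding, so the residual error E_S behaves as additional noise the encoder cannot presubtract:

```python
import math
import random

random.seed(1)
n = 50000
Qs_tilde, Qe = 1.0, 0.25   # hypothetical variances for S-tilde and the error

# Partial SI available at the informed encoder, and an independent error term:
S_tilde = [random.gauss(0.0, math.sqrt(Qs_tilde)) for _ in range(n)]
E = [random.gauss(0.0, math.sqrt(Qe)) for _ in range(n)]

# True state per Remark 4: S = S_tilde + E_S. The encoder can only adapt to
# S_tilde, so E acts as extra unremovable noise in the rate expressions.
S = [st + e for st, e in zip(S_tilde, E)]

print(sum(s * s for s in S) / n)  # close to Qs_tilde + Qe = 1.25
```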

Corollaries of Theorem 4
A variety of previous results for Gaussian channels are included in Theorem 4 as special cases.
Corollary 11. The capacity inner bound for the Gaussian point-to-point channel with side information non-causally known at the source [5, Case 1] can be obtained by omitting the relay and the uninformed encoder and imposing the following condition: Obviously, as a special case of [5], by setting ρ_{2s} = 0, Costa's result [4] is obtained.
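A small sketch of the quantitative content of Costa's result (the function name and parameter values are our own illustration): the dirty-paper rate equals the interference-free AWGN capacity and does not depend on the interference power:

```python
import math

def awgn_rate(P, N):
    """Interference-free AWGN capacity, 0.5 * log2(1 + P/N) bits per use."""
    return 0.5 * math.log2(1.0 + P / N)

# Costa's dirty paper result: with the interference known non-causally at
# the transmitter, the achievable rate equals the interference-free
# capacity, whatever the interference power Qs is.
P, N = 1.0, 1.0
rates = [awgn_rate(P, N) for Qs in (0.0, 1.0, 100.0)]
print(rates)  # all entries equal 0.5
```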

Corollary 12.
The achievable rate region for the Gaussian MAC with a common message [52] can be obtained by omitting the relay and the side information at the encoder and imposing the following condition: Corollary 13. The achievable rate region for the Gaussian MAC [53, 54] can be obtained by omitting the relay, the common message and the side information at the encoder:
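As a numerical sanity check for Corollary 13 (the helper names and power values are our own), the three bounds defining the two-user Gaussian MAC pentagon can be evaluated directly:

```python
import math

def C(snr):
    """Gaussian capacity function C(x) = 0.5 * log2(1 + x)."""
    return 0.5 * math.log2(1.0 + snr)

def gaussian_mac_bounds(P1, P2, N):
    """Pentagon constraints of the two-user Gaussian MAC region [53, 54]:
    R1 <= C(P1/N), R2 <= C(P2/N), R1 + R2 <= C((P1 + P2)/N)."""
    return C(P1 / N), C(P2 / N), C((P1 + P2) / N)

R1max, R2max, Rsum = gaussian_mac_bounds(1.0, 3.0, 1.0)
print(R1max, R2max, Rsum)  # 0.5, 1.0, ~1.161
```

Note that the sum-rate bound C((P_1 + P_2)/N) is strictly smaller than C(P_1/N) + C(P_2/N), which is what gives the region its pentagon shape.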

Corollary 14.
The achievable rate region for the Gaussian MAC with state known to one encoder and independent messages [15] can be obtained by omitting the relay and imposing the following condition: Corollary 15. It can be shown that the achievable rate region for the Gaussian MAC with state known to one encoder (only the uninformed encoder transmits a private message) [16] can be obtained by omitting the relay and imposing the following condition:

NUMERICAL RESULTS
The obtained discrete alphabet results can be extended, using linear relations between the channel inputs and outputs (relations (21) and (22) in Section 4), to continuous alphabet versions: Gaussian versions with constant channel gains (affected by path loss) and wireless block-fading versions with random channel coefficients (affected by general path loss, shadowing and multipath fading). In this paper, we have studied the Gaussian version (Theorem 4 in Section 4), assuming the gains to be static and determined by free-space path loss as a function of the distance between transmitter and receiver (Figure 2), modelled by g_{R1} = d_{R1}^{-γ} (where γ is the free-space loss exponent and d_{R1} is the distance between the uninformed encoder and the relay), and in general by g_{ij} = d_{ij}^{-γ}, where d_{ij} is the distance between node i ∈ {1, 2, R, S} and node j ∈ {R, D}.
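The path-loss model can be sketched as follows (the helper function, the exponent value and the distances are illustrative assumptions, not values from the paper):

```python
def link_gain(d, gamma=2.0):
    """Free-space path-loss gain g_ij = d_ij ** (-gamma); the exponent and
    the distances used below are illustrative, not from the paper."""
    return d ** (-gamma)

# Gains for hypothetical node distances, node i in {1, 2, R, S}, j in {R, D}:
distances = {("1", "R"): 1.0, ("2", "R"): 2.0, ("1", "D"): 3.0}
gains = {link: link_gain(d) for link, d in distances.items()}
print(gains[("2", "R")])  # 2.0 ** -2 = 0.25
```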
In this section, we compare our results with previous papers on the MARC with or without SI. As can be seen in the rates of Theorem 4 (and as explained in Remarks 3 and 4), the informed encoder can cancel part of the interference effect and thereby achieve better rates. In Figure 3, we depict our capacity inner bounds on R_2, R_1 + R_2 and R_0 + R_1 + R_2 versus the relay power, compared with the bounds studied in [41]; it can be seen that the informed encoder cancels part of the interference effect. In Figure 4, we compare the inner bounds on R_2, R_1 + R_2 and R_0 + R_1 + R_2 for our channel; our capacity inner bound on R_2 is also compared with the corresponding bound (R_2) of [43], obtained with the strategies discussed in the previous sections, none of which has been proved optimal. Figure 6 illustrates a situation in which the coding used in this paper outperforms the coding studied in [43].

CONCLUSION
In this paper, we have derived a general achievable rate region for the discrete alphabet memoryless two-user multiple-access relay channel, where only one of the encoders knows the side information non-causally, the relay utilises the PDF strategy and decodes only part of the messages, and both encoders transmit common and private messages. The obtained general capacity inner bound was shown to include all the previous results, emphasising the effects of the SI and of the relay's PDF strategy. No capacity region is known for the MARC with or without side information; here, we have shown that our capacity inner bound is tight for two special classes of multiple-access relay channels with non-causal side information known at only one source. Finally, we have extended our discrete alphabet theorem to the Gaussian case (with constant channel gains) to clarify the practical aspects of the theorem, and proved that various previously studied important Gaussian results are all special cases of our results; the derived discrete alphabet results can likewise be extended to continuous alphabet block-fading wireless versions, in the same way as the Gaussian case but with random channel coefficients. The obtained theoretical results have been compared numerically with two of the previous papers.

APPENDIX

l: follows since the channel is degraded. m: follows since the channel is reversely degraded. n: follows since

A.1 Proof of Theorem 1

Codebook generation: Fix the pmf p(s, x_R, u_0, u_1, u_2, x_1, v) in (8) and the function x_2(s, x_R, u_0, u_2, v):
1) Generate 2^{nR} independent identically distributed n-sequences x_R^n, each according to p(x_R^n); and conditionally independent n-sequences v^n, each drawn according to p(v^n | x_R^n(m), u_0^n(j, m), u_2^n(l_2, l̃_2, j, m)) = ∏_{i=1}^n p(v_i | x_{R,i}(m), u_{0,i}(j, m), u_{2,i}(l_2, l̃_2, j, m)), indexed as v^n(q_2, q̃_2, l_2, l̃_2, j, m), q_2 ∈ [1 : 2^{nR''_2}], q̃_2 ∈ [1 : 2^{n I(V; S | X_R, U_0, U_2)}].
7) Partition the set of messages w'_k, k = 1, 2, randomly into 2^{nR} equal-size parts, and index the partitions of u_k^n by m ∈ [1 : 2^{nR}].
8) Partition the set of message pairs (w'_1, w'_2) randomly into 2^{nR} equal-size parts, and index the partitions of (u_1^n, u_2^n) by m ∈ [1 : 2^{nR}].
9) Partition the set of message triples (w_0, w'_1, w'_2) randomly into 2^{nR} equal-size parts, and index the partitions of (u_0^n, u_1^n, u_2^n) by m ∈ [1 : 2^{nR}].

Encoding in source terminals:
Let the new messages to be sent in B + 1 blocks be given; split the messages into B equally sized blocks. Transmitter 1 sends x_{1,b}^n(w''_{1,b}, w'_{1,b}, w_{0,b}, m_{b-1}), and transmitter 2, knowing S^n, finds u_{2,b}^n(l_{2,b}, w'_{2,b}, w_{0,b}, m_{b-1}) such that: if more than one such sequence exists, it picks the one with the smallest index, and if no such sequence exists, it sets l̃_{2,b} = 1; similarly, if more than one sequence exists, it picks the one with the smallest index, and if no such sequence exists, it sets q̃_{2,b} = 1. Transmitter 2 then sends x_{2,b}^n(·, s^n).
Encoding and decoding at the relay: After the transmission of block b is completed, the relay has seen y_{R,b}^n. The relay tries to find ŵ_{0,b}, ŵ'_{1,b} and ŵ'_{2,b} such that: The relay then transmits x_{R,b}^n(m_b). Decoding at the receiver: After block b + 1, the receiver has seen y_{D,b}^n and y_{D,b+1}^n, and tries to find m̃_b, w̃_{0,b}, w̃'_{1,b}, and Error analysis at the relay: Assume without loss of generality that w_{0,b} = w'_{1,b} = w'_{2,b} = m_{b-1} = 1, and let L̃_{2,b} denote the index satisfying (A.2). Hence, the probability of error at the relay is: where Error analysis at the receiver: Assume without loss of generality that all indices equal 1, and let L̃_{2,b} and Q̃_{2,b} denote the indices satisfying (A.1) and (A.2), respectively. Hence, the probability of error at the destination is bounded by: (V^n(q_{2,b}, 1, l̃_{2,b}, 1, 1, 1), S^n) ∉ A_ε^n. It can be shown that the probability of error tends to zero for n large enough if the following bounds are satisfied.

A.2
Proof of Theorem 2. For brevity, only some of the bounds are proved here; the others can be handled similarly.

A.4 Proof of Theorem 4
The results of Theorem 1 can be extended to Theorem 4. The distribution in Theorem 1 extends to the linear continuous alphabet channel, that is, the Gaussian version, enabling us to generate the inputs and auxiliary random variables as follows: where, in order for the rates to be maximised, the inputs should be Gaussian random variables: X_R ∼ N(0, P_R), X_1 ∼ N(0, P_1), X_2 ∼ N(0, P_2), W ∼ N(0, P_0), M_1 ∼ N(0, α_1 P_1), M_2 ∼ N(0, P_2), where X_R, M_1, N_1, M_2 and W are mutually independent random variables and 0 ≤ ᾱ = 1 − α, ᾱ_1 = 1 − α_1, β̄_1 = 1 − β_1 ≤ 1. The informed encoder exploits a more general Gaussian DPC (GDPC); hence, the informed source's input and the known side information are correlated with parameter −1 ≤ ρ_{2s} ≤ 1. By splitting the private message into two parts, we generate U_2 V as the total private message, which should be decoded completely only by the destination, and U_2 as the part of the message decodable by both the relay and the destination. We also define: Hence: It can also be seen that: Moreover: where we define Ỹ_D = g_{D2} X̃_2 + g_{SD} S + Z_D and Ỹ_R = g_{R2} X̃_2 + g_{SR} S + Z_R. Hence, it can be shown that the rate expressions (23)–(29) follow, and the proof is completed.